Oct 03 18:14:17 crc systemd[1]: Starting Kubernetes Kubelet... Oct 03 18:14:17 crc restorecon[4672]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 03 18:14:17 
crc restorecon[4672]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 
18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc 
restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 
crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 
crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 
18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 18:14:17 crc 
restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 
18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:17 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 
18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc 
restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 18:14:18 crc restorecon[4672]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 18:14:18 crc restorecon[4672]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 03 18:14:18 crc kubenswrapper[4835]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 18:14:18 crc kubenswrapper[4835]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 03 18:14:18 crc kubenswrapper[4835]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 18:14:18 crc kubenswrapper[4835]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
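Nearly all of the records above are restorecon walking /var/lib/kubelet during kubelet startup and reporting paths whose admin-customized SELinux labels it left in place, each record naming the path and the label (type plus MCS category pair) it kept; the kubenswrapper lines that begin here then list kubelet flags that are deprecated in favor of the config file passed via --config. A minimal sketch for tallying those two kinds of records from a saved copy of this excerpt (an illustrative script, not part of this log; it assumes the text arrives on stdin and that records are only ever wrapped at whitespace, as in this dump):

```python
#!/usr/bin/env python3
"""Summarize a saved kubelet journal excerpt like the one above (illustrative sketch)."""
import re
import sys
from collections import Counter

# "<path> not reset as customized by admin to <selinux label>"
NOT_RESET = re.compile(
    r"(/\S+) not reset as customized by admin to\s+"
    r"(\w+_u:\w+_r:\w+_t:s0(?::c\d+,c\d+)?)"
)
# "Flag --foo has been deprecated, <reason>" (reason runs until the next journal timestamp)
DEPRECATED_FLAG = re.compile(
    r"Flag (--[\w-]+) has been deprecated, (.*?)(?=Oct \d{2} \d{2}:\d{2}:\d{2}|$)"
)

def main() -> None:
    text = re.sub(r"\s+", " ", sys.stdin.read())  # undo the line wrapping seen above
    labels = Counter(label for _path, label in NOT_RESET.findall(text))
    print(f"{sum(labels.values())} 'not reset' records by SELinux label:")
    for label, count in labels.most_common():
        print(f"  {count:5d}  {label}")
    print("deprecated kubelet flags still passed on the command line:")
    for flag, reason in DEPRECATED_FLAG.findall(text):
        print(f"  {flag}: {reason.strip()}")

if __name__ == "__main__":
    main()
```

Fed this section's text (for example a pasted copy, or journalctl -u kubelet.service --no-pager if that is the unit name on this host), it prints one line per SELinux label with its record count plus the deprecated flags; the script and file names here are placeholders rather than anything taken from the log.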
Oct 03 18:14:18 crc kubenswrapper[4835]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 03 18:14:18 crc kubenswrapper[4835]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.669641 4835 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675521 4835 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675557 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675563 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675568 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675574 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675580 4835 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675586 4835 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675592 4835 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675598 4835 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675603 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675608 4835 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675613 4835 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675618 4835 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675623 4835 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675627 4835 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675632 4835 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675636 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675653 4835 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675658 4835 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675663 4835 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675667 4835 
feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675672 4835 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675676 4835 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675681 4835 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675685 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675690 4835 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675695 4835 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675700 4835 feature_gate.go:330] unrecognized feature gate: Example Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675712 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675718 4835 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675723 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675728 4835 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675732 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675737 4835 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675741 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675747 4835 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675753 4835 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675758 4835 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675764 4835 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675769 4835 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675775 4835 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675781 4835 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675786 4835 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675793 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675798 4835 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675805 4835 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675810 4835 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675816 4835 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675822 4835 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675827 4835 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675833 4835 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675837 4835 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675842 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675847 4835 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675852 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675856 4835 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675861 4835 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675868 4835 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675872 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675877 4835 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675881 4835 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675886 4835 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675891 4835 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675895 4835 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675900 4835 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675904 4835 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675909 4835 feature_gate.go:330] 
unrecognized feature gate: SignatureStores Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675913 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675917 4835 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675922 4835 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.675926 4835 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677036 4835 flags.go:64] FLAG: --address="0.0.0.0" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677054 4835 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677087 4835 flags.go:64] FLAG: --anonymous-auth="true" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677096 4835 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677103 4835 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677109 4835 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677117 4835 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677125 4835 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677130 4835 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677136 4835 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677142 4835 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677148 4835 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677153 4835 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677158 4835 flags.go:64] FLAG: --cgroup-root="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677163 4835 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677169 4835 flags.go:64] FLAG: --client-ca-file="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677174 4835 flags.go:64] FLAG: --cloud-config="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677180 4835 flags.go:64] FLAG: --cloud-provider="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677185 4835 flags.go:64] FLAG: --cluster-dns="[]" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677194 4835 flags.go:64] FLAG: --cluster-domain="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677199 4835 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677205 4835 flags.go:64] FLAG: --config-dir="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677210 4835 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677217 4835 flags.go:64] FLAG: --container-log-max-files="5" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677224 4835 flags.go:64] 
FLAG: --container-log-max-size="10Mi" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677231 4835 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677237 4835 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677242 4835 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677248 4835 flags.go:64] FLAG: --contention-profiling="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677253 4835 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677258 4835 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677277 4835 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677283 4835 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677291 4835 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677296 4835 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677302 4835 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677309 4835 flags.go:64] FLAG: --enable-load-reader="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677314 4835 flags.go:64] FLAG: --enable-server="true" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677320 4835 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677327 4835 flags.go:64] FLAG: --event-burst="100" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677334 4835 flags.go:64] FLAG: --event-qps="50" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677349 4835 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677354 4835 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677360 4835 flags.go:64] FLAG: --eviction-hard="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677367 4835 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677372 4835 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677378 4835 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677384 4835 flags.go:64] FLAG: --eviction-soft="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677390 4835 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677395 4835 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677400 4835 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677405 4835 flags.go:64] FLAG: --experimental-mounter-path="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677411 4835 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677416 4835 flags.go:64] FLAG: --fail-swap-on="true" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 
18:14:18.677432 4835 flags.go:64] FLAG: --feature-gates="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677440 4835 flags.go:64] FLAG: --file-check-frequency="20s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677446 4835 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677453 4835 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677458 4835 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677464 4835 flags.go:64] FLAG: --healthz-port="10248" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677469 4835 flags.go:64] FLAG: --help="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677475 4835 flags.go:64] FLAG: --hostname-override="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677480 4835 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677485 4835 flags.go:64] FLAG: --http-check-frequency="20s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677491 4835 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677496 4835 flags.go:64] FLAG: --image-credential-provider-config="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677501 4835 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677507 4835 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677512 4835 flags.go:64] FLAG: --image-service-endpoint="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677517 4835 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677522 4835 flags.go:64] FLAG: --kube-api-burst="100" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677527 4835 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677533 4835 flags.go:64] FLAG: --kube-api-qps="50" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677538 4835 flags.go:64] FLAG: --kube-reserved="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677543 4835 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677548 4835 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677556 4835 flags.go:64] FLAG: --kubelet-cgroups="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677561 4835 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677567 4835 flags.go:64] FLAG: --lock-file="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677572 4835 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677578 4835 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677583 4835 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677592 4835 flags.go:64] FLAG: --log-json-split-stream="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677598 4835 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677603 4835 flags.go:64] FLAG: 
--log-text-split-stream="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677609 4835 flags.go:64] FLAG: --logging-format="text" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677614 4835 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677620 4835 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677625 4835 flags.go:64] FLAG: --manifest-url="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677631 4835 flags.go:64] FLAG: --manifest-url-header="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677638 4835 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677644 4835 flags.go:64] FLAG: --max-open-files="1000000" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677651 4835 flags.go:64] FLAG: --max-pods="110" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677656 4835 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677662 4835 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677667 4835 flags.go:64] FLAG: --memory-manager-policy="None" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677673 4835 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677679 4835 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677684 4835 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677690 4835 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677703 4835 flags.go:64] FLAG: --node-status-max-images="50" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677708 4835 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677714 4835 flags.go:64] FLAG: --oom-score-adj="-999" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677719 4835 flags.go:64] FLAG: --pod-cidr="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677725 4835 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677735 4835 flags.go:64] FLAG: --pod-manifest-path="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677740 4835 flags.go:64] FLAG: --pod-max-pids="-1" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677746 4835 flags.go:64] FLAG: --pods-per-core="0" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677752 4835 flags.go:64] FLAG: --port="10250" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677758 4835 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677763 4835 flags.go:64] FLAG: --provider-id="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677769 4835 flags.go:64] FLAG: --qos-reserved="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677783 4835 flags.go:64] FLAG: --read-only-port="10255" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677788 4835 flags.go:64] FLAG: 
--register-node="true" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677794 4835 flags.go:64] FLAG: --register-schedulable="true" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677800 4835 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677816 4835 flags.go:64] FLAG: --registry-burst="10" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677822 4835 flags.go:64] FLAG: --registry-qps="5" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677827 4835 flags.go:64] FLAG: --reserved-cpus="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677834 4835 flags.go:64] FLAG: --reserved-memory="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677842 4835 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677847 4835 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677854 4835 flags.go:64] FLAG: --rotate-certificates="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677860 4835 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677866 4835 flags.go:64] FLAG: --runonce="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677871 4835 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677877 4835 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677882 4835 flags.go:64] FLAG: --seccomp-default="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677888 4835 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677893 4835 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677899 4835 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677905 4835 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677911 4835 flags.go:64] FLAG: --storage-driver-password="root" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677916 4835 flags.go:64] FLAG: --storage-driver-secure="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677921 4835 flags.go:64] FLAG: --storage-driver-table="stats" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677927 4835 flags.go:64] FLAG: --storage-driver-user="root" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677932 4835 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677938 4835 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677944 4835 flags.go:64] FLAG: --system-cgroups="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677949 4835 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677958 4835 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677964 4835 flags.go:64] FLAG: --tls-cert-file="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677969 4835 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677978 4835 flags.go:64] 
FLAG: --tls-min-version="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677983 4835 flags.go:64] FLAG: --tls-private-key-file="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677989 4835 flags.go:64] FLAG: --topology-manager-policy="none" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.677994 4835 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.678001 4835 flags.go:64] FLAG: --topology-manager-scope="container" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.678008 4835 flags.go:64] FLAG: --v="2" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.678017 4835 flags.go:64] FLAG: --version="false" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.678026 4835 flags.go:64] FLAG: --vmodule="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.678037 4835 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.678045 4835 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678199 4835 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678208 4835 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678215 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678220 4835 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678224 4835 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678230 4835 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678235 4835 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678240 4835 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678245 4835 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678249 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678254 4835 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678258 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678263 4835 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678267 4835 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678272 4835 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678277 4835 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678281 4835 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678285 4835 feature_gate.go:330] 
unrecognized feature gate: BootcNodeManagement Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678290 4835 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678294 4835 feature_gate.go:330] unrecognized feature gate: Example Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678298 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678303 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678307 4835 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678313 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678317 4835 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678321 4835 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678326 4835 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678330 4835 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678334 4835 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678339 4835 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678343 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678354 4835 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678359 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678364 4835 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678369 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678375 4835 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678381 4835 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678386 4835 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678390 4835 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678395 4835 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678401 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678405 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678410 4835 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678415 4835 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678420 4835 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678424 4835 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678428 4835 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678433 4835 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678437 4835 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678444 4835 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678450 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678455 4835 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678460 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678465 4835 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678470 4835 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678475 4835 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678479 4835 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678484 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678490 4835 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
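[editor's note] The flags.go:64 lines above record every effective kubelet flag. A minimal sketch, assuming journalctl-style text like this capture, that extracts those "FLAG: --name=\"value\"" pairs into a dict so they can be diffed against the config file; the script name in the comment is illustrative.

```python
# Minimal sketch: collect the flags.go:64 "FLAG:" entries from a kubelet journal
# capture into {flag-name: value}. Assumes the line format shown in this log.
import re
import sys

FLAG_RE = re.compile(r'flags\.go:\d+\] FLAG: --([A-Za-z0-9-]+)="(.*?)"')

def parse_kubelet_flags(journal_text: str) -> dict:
    """Return {flag-name: value} for every FLAG: line found in the text."""
    return {name: value for name, value in FLAG_RE.findall(journal_text)}

if __name__ == "__main__":
    # e.g. journalctl -u kubelet | python3 parse_flags.py   (illustrative usage)
    flags = parse_kubelet_flags(sys.stdin.read())
    for name in sorted(flags):
        print(f"{name} = {flags[name]!r}")
```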
Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678545 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678554 4835 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678560 4835 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678565 4835 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678572 4835 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678583 4835 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678590 4835 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678595 4835 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678601 4835 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678606 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678611 4835 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.678616 4835 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.678633 4835 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.686738 4835 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.686782 4835 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686850 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686858 4835 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686876 4835 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686881 4835 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686886 4835 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686891 4835 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686895 4835 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686900 4835 
feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686904 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686908 4835 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686911 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686915 4835 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686919 4835 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686922 4835 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686927 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686931 4835 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686934 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686938 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686942 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686947 4835 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686952 4835 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686956 4835 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686960 4835 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686964 4835 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686968 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686972 4835 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686975 4835 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686979 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686984 4835 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686989 4835 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686993 4835 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.686997 4835 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687001 4835 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 18:14:18 crc 
kubenswrapper[4835]: W1003 18:14:18.687006 4835 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687014 4835 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687019 4835 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687022 4835 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687026 4835 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687030 4835 feature_gate.go:330] unrecognized feature gate: Example Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687035 4835 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687040 4835 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687044 4835 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687048 4835 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687052 4835 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687056 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687060 4835 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687064 4835 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687091 4835 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687095 4835 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687099 4835 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687104 4835 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687110 4835 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687114 4835 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687118 4835 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687123 4835 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687128 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687132 4835 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687136 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687140 4835 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687144 4835 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687148 4835 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687151 4835 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687156 4835 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687161 4835 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687165 4835 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687169 4835 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687173 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687177 4835 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687180 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687184 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687189 4835 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.687195 4835 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687323 4835 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687332 4835 
feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687338 4835 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687344 4835 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687348 4835 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687352 4835 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687355 4835 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687359 4835 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687363 4835 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687367 4835 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687371 4835 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687375 4835 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687378 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687382 4835 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687386 4835 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687390 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687393 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687397 4835 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687401 4835 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687405 4835 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687408 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687413 4835 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687417 4835 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687422 4835 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687426 4835 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687430 4835 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687435 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687438 4835 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687442 4835 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687446 4835 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687450 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687453 4835 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687457 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687461 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687465 4835 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687469 4835 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687472 4835 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687476 4835 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687479 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687484 4835 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
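[editor's note] The kubelet logs its resolved feature-gate map several times in this capture (the feature_gate.go:386 entries above and below). A minimal sketch, assuming the Go fmt "map[Name:bool ...]" rendering shown here, that turns those summaries into a Python dict.

```python
# Minimal sketch: parse the "feature gates: {map[...]}" summaries logged at
# feature_gate.go:386 into {gate-name: enabled}. Regexes assume the rendering
# visible in this capture, not a general Go-map parser.
import re

SUMMARY_RE = re.compile(r'feature gates: \{map\[(.*?)\]\}')
PAIR_RE = re.compile(r'([A-Za-z0-9]+):(true|false)')

def parse_feature_gates(journal_text: str) -> dict:
    gates = {}
    for body in SUMMARY_RE.findall(journal_text):
        for name, value in PAIR_RE.findall(body):
            gates[name] = (value == "true")
    return gates

# Example with a shortened form of the summary logged above:
sample = "feature gates: {map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false]}"
assert parse_feature_gates(sample) == {
    "CloudDualStackNodeIPs": True, "KMSv1": True, "NodeSwap": False,
}
```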
Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687489 4835 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687493 4835 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687497 4835 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687501 4835 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687506 4835 feature_gate.go:330] unrecognized feature gate: Example Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687510 4835 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687513 4835 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687518 4835 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687521 4835 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687525 4835 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687529 4835 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687533 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687537 4835 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687541 4835 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687544 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687548 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687552 4835 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687556 4835 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687560 4835 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687564 4835 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687568 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687571 4835 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687575 4835 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687578 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687582 4835 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687587 4835 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 
18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687591 4835 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687594 4835 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687598 4835 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687601 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.687605 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.687612 4835 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.687785 4835 server.go:940] "Client rotation is on, will bootstrap in background" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.691660 4835 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.691739 4835 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.693192 4835 server.go:997] "Starting client certificate rotation" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.693220 4835 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.694525 4835 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-09 16:05:10.446001687 +0000 UTC Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.694597 4835 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 885h50m51.751418418s for next certificate rotation Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.721487 4835 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.723864 4835 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.741651 4835 log.go:25] "Validated CRI v1 runtime API" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.774708 4835 log.go:25] "Validated CRI v1 image API" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.776300 4835 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.781907 4835 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-03-18-09-32-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.781935 4835 fs.go:134] Filesystem partitions: 
map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.795421 4835 manager.go:217] Machine: {Timestamp:2025-10-03 18:14:18.79396982 +0000 UTC m=+0.509910702 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:5536f758-9b73-4d0a-adbf-baceea025860 BootID:09a7fe10-d48b-4c2b-a983-f4d4d5c8e340 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:5c:a2:43 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:5c:a2:43 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:be:4c:67 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:49:7e:af Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:26:71:73 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:43:b1:0f Speed:-1 Mtu:1496} {Name:eth10 MacAddress:de:b0:82:e5:ab:85 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fe:5a:10:5b:f6:df Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.795634 4835 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.795728 4835 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.797359 4835 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.797519 4835 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.797550 4835 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.797741 4835 topology_manager.go:138] "Creating topology manager with none policy" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.797749 4835 container_manager_linux.go:303] "Creating device plugin manager" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.798212 4835 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.798238 4835 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.798416 4835 state_mem.go:36] "Initialized new in-memory state store" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.798500 4835 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.801800 4835 kubelet.go:418] "Attempting to sync node with API server" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.801824 4835 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.801852 4835 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.801867 4835 kubelet.go:324] "Adding apiserver pod source"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.801881 4835 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.805852 4835 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.807002 4835 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.809117 4835 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.810602 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused
Oct 03 18:14:18 crc kubenswrapper[4835]: E1003 18:14:18.810717 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError"
Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.810804 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused
Oct 03 18:14:18 crc kubenswrapper[4835]: E1003 18:14:18.810877 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.812345 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.812381 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.812392 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.812401 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.812416 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.812428 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.812438 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.812453 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.812465 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.812476 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.812512 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.812522 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.812552 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.813000 4835 server.go:1280] "Started kubelet"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.813244 4835 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.813469 4835 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.813746 4835 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.814089 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused
Oct 03 18:14:18 crc systemd[1]: Started Kubernetes Kubelet.
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.815322 4835 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.815372 4835 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.815462 4835 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 12:59:33.424776371 +0000 UTC
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.815492 4835 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1242h45m14.60928607s for next certificate rotation
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.815522 4835 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.815529 4835 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.815603 4835 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 03 18:14:18 crc kubenswrapper[4835]: E1003 18:14:18.815763 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.816586 4835 factory.go:55] Registering systemd factory
Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.816630 4835 factory.go:221] Registration of the systemd container factory successfully
Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.816921 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect:
connection refused Oct 03 18:14:18 crc kubenswrapper[4835]: E1003 18:14:18.817046 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 18:14:18 crc kubenswrapper[4835]: E1003 18:14:18.817196 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="200ms" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.817339 4835 server.go:460] "Adding debug handlers to kubelet server" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.817337 4835 factory.go:153] Registering CRI-O factory Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.817659 4835 factory.go:221] Registration of the crio container factory successfully Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.817807 4835 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.817903 4835 factory.go:103] Registering Raw factory Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.817994 4835 manager.go:1196] Started watching for new ooms in manager Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.818602 4835 manager.go:319] Starting recovery of all containers Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.827992 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828093 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828108 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828122 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828135 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828151 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828162 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828172 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828191 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828201 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828216 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828229 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828247 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828262 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828280 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828293 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828306 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828341 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828357 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828376 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828395 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828408 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828426 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828442 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828463 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828485 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828508 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828529 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828546 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828561 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828578 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828591 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828607 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828622 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828636 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828650 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828663 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828677 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828695 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828707 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828721 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828738 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828751 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828767 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828780 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828797 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828811 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828823 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828837 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828850 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828867 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828882 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828910 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828932 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828947 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828965 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828980 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.828996 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829010 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829029 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829041 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829053 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829118 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829132 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829150 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829165 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829179 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829196 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829209 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829223 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829235 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829246 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829261 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829273 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829285 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829298 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829310 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829326 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829337 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829394 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829766 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829851 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829891 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829917 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829949 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829971 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.829994 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.830810 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.830859 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.830892 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: E1003 18:14:18.827729 4835 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.173:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186b0dce3a6d2f42 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-03 18:14:18.81296877 +0000 UTC m=+0.528909642,LastTimestamp:2025-10-03 18:14:18.81296877 +0000 UTC m=+0.528909642,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831634 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831667 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831689 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831702 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831717 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831730 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831739 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831753 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831764 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831776 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831787 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831799 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831812 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831825 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831847 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831866 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831884 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831901 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831915 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831932 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831948 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831960 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831978 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.831992 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832003 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832016 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832030 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832041 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832056 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832090 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832108 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832118 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832131 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832146 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832434 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832452 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832464 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832479 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832495 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832506 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832517 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832532 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832543 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832557 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832568 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832580 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832595 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832607 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832620 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.832634 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.836270 4835 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.836477 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.836509 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.836540 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.836564 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.836590 4835 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.836611 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.838534 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.838632 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.838701 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.838841 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.838924 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.838997 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.839057 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.839135 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.839194 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.839250 4835 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.839306 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.839370 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.839428 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.839488 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.839548 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.839609 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.839673 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.839731 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.839828 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.839895 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.839954 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.840020 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.840108 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.840176 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.840234 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.840291 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.840412 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.840492 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.840550 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.840607 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.841544 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.841621 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.841696 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.841756 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.841823 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.841882 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842004 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842084 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842144 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842201 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842254 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842308 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842372 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842431 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842487 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842544 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842598 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842654 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842718 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842798 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842857 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842913 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.842967 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.843026 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.843098 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.843158 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.843217 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.843287 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.843363 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.843422 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.843479 4835 manager.go:324] Recovery completed Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.843482 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.843573 4835 reconstruct.go:97] "Volume reconstruction finished" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.843585 4835 reconciler.go:26] "Reconciler: start to sync state" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.853152 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.854858 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.854890 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.854902 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.855710 4835 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.855779 4835 
cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.855858 4835 state_mem.go:36] "Initialized new in-memory state store" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.872942 4835 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.875561 4835 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.875602 4835 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.875630 4835 kubelet.go:2335] "Starting kubelet main sync loop" Oct 03 18:14:18 crc kubenswrapper[4835]: E1003 18:14:18.875676 4835 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.877011 4835 policy_none.go:49] "None policy: Start" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.878663 4835 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.878703 4835 state_mem.go:35] "Initializing new in-memory state store" Oct 03 18:14:18 crc kubenswrapper[4835]: W1003 18:14:18.878861 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 18:14:18 crc kubenswrapper[4835]: E1003 18:14:18.878962 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 18:14:18 crc kubenswrapper[4835]: E1003 18:14:18.916950 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.951032 4835 manager.go:334] "Starting Device Plugin manager" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.952501 4835 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.952547 4835 server.go:79] "Starting device plugin registration server" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.953038 4835 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.953058 4835 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.953170 4835 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.953271 4835 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.953284 4835 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 03 18:14:18 crc kubenswrapper[4835]: E1003 18:14:18.959740 4835 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get 
node info: node \"crc\" not found" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.976004 4835 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.976162 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.977272 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.977302 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.977317 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.977438 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.977561 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.977592 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.978051 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.978095 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.978106 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.978190 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.978315 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.978347 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.978915 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.978941 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.978951 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.979079 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.979112 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.979125 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.979285 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.979411 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.979456 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.980157 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.980180 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.980188 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.980491 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.980512 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.980548 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.980571 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.980556 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.980582 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.980749 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.980871 4835 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.981012 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.981708 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.981728 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.981737 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.982229 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.982264 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.982278 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.982632 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.982664 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.983345 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.983374 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:18 crc kubenswrapper[4835]: I1003 18:14:18.983386 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:19 crc kubenswrapper[4835]: E1003 18:14:19.017757 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="400ms" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.046107 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.047031 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.047115 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.047148 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.047178 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.047212 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.047295 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.047379 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.047448 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.047509 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.047567 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.047609 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" 
Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.047666 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.047712 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.047743 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.053378 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.054461 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.054497 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.054508 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.054535 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 18:14:19 crc kubenswrapper[4835]: E1003 18:14:19.054865 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.148731 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.148811 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.148832 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.148847 4835 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.148863 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.148884 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.148898 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.148914 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.148930 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.148946 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.148945 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.148959 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.148958 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.149000 4835 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.148973 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.149037 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.149043 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.149088 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.149096 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.149092 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.149120 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.149151 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.149175 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:19 
crc kubenswrapper[4835]: I1003 18:14:19.149197 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.149232 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.149131 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.149251 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.149299 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.149315 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.149425 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.255964 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.257161 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.257194 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.257202 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.257224 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 18:14:19 crc kubenswrapper[4835]: E1003 18:14:19.257490 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: 
connect: connection refused" node="crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.316129 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.332001 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.337845 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.355844 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.360110 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 18:14:19 crc kubenswrapper[4835]: W1003 18:14:19.368577 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-96b94669970f2ba5baba394dfa8ece9f46cfc13a6379687a7bd7b2aae7061cc0 WatchSource:0}: Error finding container 96b94669970f2ba5baba394dfa8ece9f46cfc13a6379687a7bd7b2aae7061cc0: Status 404 returned error can't find the container with id 96b94669970f2ba5baba394dfa8ece9f46cfc13a6379687a7bd7b2aae7061cc0 Oct 03 18:14:19 crc kubenswrapper[4835]: W1003 18:14:19.370919 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-dadfb07aadabf6bd658c3d2fa5b0fceaa8c04d6a9c3b4208b8abf4c77026aba7 WatchSource:0}: Error finding container dadfb07aadabf6bd658c3d2fa5b0fceaa8c04d6a9c3b4208b8abf4c77026aba7: Status 404 returned error can't find the container with id dadfb07aadabf6bd658c3d2fa5b0fceaa8c04d6a9c3b4208b8abf4c77026aba7 Oct 03 18:14:19 crc kubenswrapper[4835]: W1003 18:14:19.376030 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-a4a7cb7b4a186412c9cbd2eb0339db17e58060967eb6b22dedc870398510837c WatchSource:0}: Error finding container a4a7cb7b4a186412c9cbd2eb0339db17e58060967eb6b22dedc870398510837c: Status 404 returned error can't find the container with id a4a7cb7b4a186412c9cbd2eb0339db17e58060967eb6b22dedc870398510837c Oct 03 18:14:19 crc kubenswrapper[4835]: W1003 18:14:19.382506 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-8de389382e7a6363942345de13591ba9dd0611433487da6b40b6fa2eecf342a3 WatchSource:0}: Error finding container 8de389382e7a6363942345de13591ba9dd0611433487da6b40b6fa2eecf342a3: Status 404 returned error can't find the container with id 8de389382e7a6363942345de13591ba9dd0611433487da6b40b6fa2eecf342a3 Oct 03 18:14:19 crc kubenswrapper[4835]: W1003 18:14:19.384845 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-df21594545c5e13f979723d2c145bf28cb69f8c6ccb22062deb2a0ec7da0a2bf WatchSource:0}: Error finding container 
df21594545c5e13f979723d2c145bf28cb69f8c6ccb22062deb2a0ec7da0a2bf: Status 404 returned error can't find the container with id df21594545c5e13f979723d2c145bf28cb69f8c6ccb22062deb2a0ec7da0a2bf Oct 03 18:14:19 crc kubenswrapper[4835]: E1003 18:14:19.419341 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="800ms" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.657933 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.659593 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.659635 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.659649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.659684 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 18:14:19 crc kubenswrapper[4835]: E1003 18:14:19.660140 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.815447 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 18:14:19 crc kubenswrapper[4835]: W1003 18:14:19.868388 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 18:14:19 crc kubenswrapper[4835]: E1003 18:14:19.868481 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.881604 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8de389382e7a6363942345de13591ba9dd0611433487da6b40b6fa2eecf342a3"} Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.882996 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a4a7cb7b4a186412c9cbd2eb0339db17e58060967eb6b22dedc870398510837c"} Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.884513 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dadfb07aadabf6bd658c3d2fa5b0fceaa8c04d6a9c3b4208b8abf4c77026aba7"} Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.885421 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"96b94669970f2ba5baba394dfa8ece9f46cfc13a6379687a7bd7b2aae7061cc0"} Oct 03 18:14:19 crc kubenswrapper[4835]: I1003 18:14:19.889496 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"df21594545c5e13f979723d2c145bf28cb69f8c6ccb22062deb2a0ec7da0a2bf"} Oct 03 18:14:19 crc kubenswrapper[4835]: W1003 18:14:19.896007 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 18:14:19 crc kubenswrapper[4835]: E1003 18:14:19.896099 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 18:14:20 crc kubenswrapper[4835]: E1003 18:14:20.220737 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="1.6s" Oct 03 18:14:20 crc kubenswrapper[4835]: W1003 18:14:20.359633 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 18:14:20 crc kubenswrapper[4835]: E1003 18:14:20.359706 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 18:14:20 crc kubenswrapper[4835]: W1003 18:14:20.373304 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 18:14:20 crc kubenswrapper[4835]: E1003 18:14:20.373370 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.460279 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.461814 4835 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.461848 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.461857 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.461879 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 18:14:20 crc kubenswrapper[4835]: E1003 18:14:20.462364 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.815325 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.896284 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43"} Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.896335 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11"} Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.896345 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88"} Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.896354 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a"} Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.896436 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.897566 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.897592 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.897601 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.899472 4835 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60" exitCode=0 Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.899524 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60"} Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.899576 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.900210 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.900232 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.900241 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.902402 4835 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063" exitCode=0 Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.902444 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063"} Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.902532 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.903770 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.903858 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.903887 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.904848 4835 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e" exitCode=0 Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.904881 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e"} Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.905053 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.908247 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.908300 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.908316 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.909295 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:20 crc 
kubenswrapper[4835]: I1003 18:14:20.909843 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.909866 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.909875 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.909922 4835 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf" exitCode=0 Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.909959 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf"} Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.910109 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.910800 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.910855 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.910866 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:20 crc kubenswrapper[4835]: I1003 18:14:20.982362 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:21 crc kubenswrapper[4835]: E1003 18:14:21.465111 4835 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.173:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186b0dce3a6d2f42 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-03 18:14:18.81296877 +0000 UTC m=+0.528909642,LastTimestamp:2025-10-03 18:14:18.81296877 +0000 UTC m=+0.528909642,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 03 18:14:21 crc kubenswrapper[4835]: W1003 18:14:21.590096 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 18:14:21 crc kubenswrapper[4835]: E1003 18:14:21.590197 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 
18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.815100 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 18:14:21 crc kubenswrapper[4835]: E1003 18:14:21.822892 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="3.2s" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.914570 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.914565 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4b6bb20070b73452498da6c6a6f79e01551a0d203cc7d85f39bc13b9e68482be"} Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.915410 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.915435 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.915444 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.917478 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3"} Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.917519 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.917525 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c"} Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.917538 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a"} Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.917547 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04"} Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.917556 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25"} Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.917998 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.918018 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.918025 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.919502 4835 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046" exitCode=0 Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.919550 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046"} Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.919630 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.920383 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.920402 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.920410 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.923028 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4aa63760f1ef0079ef65f47e141a5938e22007e5f3bf41a1e61491359b1eb0c4"} Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.923078 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.923100 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.923096 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"27b9902dcf4b7c66e119168c3b3eb90f437ca1e723186a43c648d19f4101b851"} Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.923248 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9e35992e79cb51b47dd78356feedcca634b95f0fbd0aac49017b88d555dab225"} Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.923836 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.923884 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.923892 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.923989 4835 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.924024 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:21 crc kubenswrapper[4835]: I1003 18:14:21.924036 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.062486 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.063693 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.063732 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.063744 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.063769 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 18:14:22 crc kubenswrapper[4835]: E1003 18:14:22.064224 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.547673 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.927944 4835 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b" exitCode=0 Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.928011 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b"} Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.928055 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.928063 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.928103 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.928145 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.928154 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.928563 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.928600 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.929458 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 
18:14:22.929486 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.929497 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.929498 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.929522 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.929532 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.929501 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.929561 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.929490 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.929593 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.929602 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.929498 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.929641 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.929568 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:22 crc kubenswrapper[4835]: I1003 18:14:22.929651 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:23 crc kubenswrapper[4835]: I1003 18:14:23.637208 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:23 crc kubenswrapper[4835]: I1003 18:14:23.645768 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:23 crc kubenswrapper[4835]: I1003 18:14:23.934572 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5"} Oct 03 18:14:23 crc kubenswrapper[4835]: I1003 18:14:23.934622 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:23 crc kubenswrapper[4835]: I1003 18:14:23.934636 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:23 crc kubenswrapper[4835]: I1003 18:14:23.934623 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e"} Oct 03 18:14:23 crc kubenswrapper[4835]: I1003 18:14:23.934736 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82"} Oct 03 18:14:23 crc kubenswrapper[4835]: I1003 18:14:23.934753 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1"} Oct 03 18:14:23 crc kubenswrapper[4835]: I1003 18:14:23.934764 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0"} Oct 03 18:14:23 crc kubenswrapper[4835]: I1003 18:14:23.935832 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:23 crc kubenswrapper[4835]: I1003 18:14:23.935863 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:23 crc kubenswrapper[4835]: I1003 18:14:23.935872 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:23 crc kubenswrapper[4835]: I1003 18:14:23.935904 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:23 crc kubenswrapper[4835]: I1003 18:14:23.935927 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:23 crc kubenswrapper[4835]: I1003 18:14:23.935937 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:24 crc kubenswrapper[4835]: I1003 18:14:24.936352 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:24 crc kubenswrapper[4835]: I1003 18:14:24.936409 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:24 crc kubenswrapper[4835]: I1003 18:14:24.937228 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:24 crc kubenswrapper[4835]: I1003 18:14:24.937261 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:24 crc kubenswrapper[4835]: I1003 18:14:24.937276 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:24 crc kubenswrapper[4835]: I1003 18:14:24.937472 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:24 crc kubenswrapper[4835]: I1003 18:14:24.937524 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:24 crc kubenswrapper[4835]: I1003 18:14:24.937537 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:25 crc kubenswrapper[4835]: I1003 18:14:25.111164 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:25 crc kubenswrapper[4835]: I1003 18:14:25.111298 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 18:14:25 crc kubenswrapper[4835]: I1003 18:14:25.111332 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:25 crc kubenswrapper[4835]: I1003 18:14:25.112322 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:25 crc kubenswrapper[4835]: I1003 18:14:25.112369 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:25 crc kubenswrapper[4835]: I1003 18:14:25.112381 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:25 crc kubenswrapper[4835]: I1003 18:14:25.264605 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:25 crc kubenswrapper[4835]: I1003 18:14:25.265763 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:25 crc kubenswrapper[4835]: I1003 18:14:25.265800 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:25 crc kubenswrapper[4835]: I1003 18:14:25.265811 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:25 crc kubenswrapper[4835]: I1003 18:14:25.265830 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 18:14:26 crc kubenswrapper[4835]: I1003 18:14:26.875106 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:26 crc kubenswrapper[4835]: I1003 18:14:26.875239 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 18:14:26 crc kubenswrapper[4835]: I1003 18:14:26.875283 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:26 crc kubenswrapper[4835]: I1003 18:14:26.876435 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:26 crc kubenswrapper[4835]: I1003 18:14:26.876504 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:26 crc kubenswrapper[4835]: I1003 18:14:26.876520 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:27 crc kubenswrapper[4835]: I1003 18:14:27.182994 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 03 18:14:27 crc kubenswrapper[4835]: I1003 18:14:27.183185 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:27 crc kubenswrapper[4835]: I1003 18:14:27.184093 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:27 crc kubenswrapper[4835]: I1003 18:14:27.184120 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:27 crc kubenswrapper[4835]: I1003 18:14:27.184130 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Oct 03 18:14:27 crc kubenswrapper[4835]: I1003 18:14:27.631868 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:27 crc kubenswrapper[4835]: I1003 18:14:27.632060 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:27 crc kubenswrapper[4835]: I1003 18:14:27.633037 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:27 crc kubenswrapper[4835]: I1003 18:14:27.633094 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:27 crc kubenswrapper[4835]: I1003 18:14:27.633107 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:27 crc kubenswrapper[4835]: I1003 18:14:27.862932 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 18:14:27 crc kubenswrapper[4835]: I1003 18:14:27.863111 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:27 crc kubenswrapper[4835]: I1003 18:14:27.864151 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:27 crc kubenswrapper[4835]: I1003 18:14:27.864205 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:27 crc kubenswrapper[4835]: I1003 18:14:27.864216 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:28 crc kubenswrapper[4835]: I1003 18:14:28.157733 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:28 crc kubenswrapper[4835]: I1003 18:14:28.157885 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:28 crc kubenswrapper[4835]: I1003 18:14:28.159138 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:28 crc kubenswrapper[4835]: I1003 18:14:28.159173 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:28 crc kubenswrapper[4835]: I1003 18:14:28.159186 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:28 crc kubenswrapper[4835]: E1003 18:14:28.959853 4835 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 18:14:30 crc kubenswrapper[4835]: I1003 18:14:30.676161 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 03 18:14:30 crc kubenswrapper[4835]: I1003 18:14:30.676349 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:30 crc kubenswrapper[4835]: I1003 18:14:30.677935 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:30 crc kubenswrapper[4835]: I1003 18:14:30.677975 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:30 crc kubenswrapper[4835]: I1003 
18:14:30.677990 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:31 crc kubenswrapper[4835]: I1003 18:14:31.158015 4835 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 18:14:31 crc kubenswrapper[4835]: I1003 18:14:31.158149 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 18:14:32 crc kubenswrapper[4835]: I1003 18:14:32.148118 4835 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 18:14:32 crc kubenswrapper[4835]: I1003 18:14:32.148222 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 18:14:32 crc kubenswrapper[4835]: I1003 18:14:32.152990 4835 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 18:14:32 crc kubenswrapper[4835]: I1003 18:14:32.153053 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 18:14:32 crc kubenswrapper[4835]: I1003 18:14:32.553235 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:32 crc kubenswrapper[4835]: I1003 18:14:32.553406 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:32 crc kubenswrapper[4835]: I1003 18:14:32.554480 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:32 crc kubenswrapper[4835]: I1003 18:14:32.554519 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:32 crc kubenswrapper[4835]: I1003 18:14:32.554573 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:36 crc kubenswrapper[4835]: I1003 18:14:36.880010 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:36 crc kubenswrapper[4835]: I1003 18:14:36.880250 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:36 crc kubenswrapper[4835]: I1003 18:14:36.880554 4835 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 03 18:14:36 crc kubenswrapper[4835]: I1003 18:14:36.880593 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 03 18:14:36 crc kubenswrapper[4835]: I1003 18:14:36.881270 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:36 crc kubenswrapper[4835]: I1003 18:14:36.881297 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:36 crc kubenswrapper[4835]: I1003 18:14:36.881337 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:36 crc kubenswrapper[4835]: I1003 18:14:36.884230 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:36 crc kubenswrapper[4835]: I1003 18:14:36.960949 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:36 crc kubenswrapper[4835]: I1003 18:14:36.961365 4835 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 03 18:14:36 crc kubenswrapper[4835]: I1003 18:14:36.961437 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 03 18:14:36 crc kubenswrapper[4835]: I1003 18:14:36.961733 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:36 crc kubenswrapper[4835]: I1003 18:14:36.961763 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:36 crc kubenswrapper[4835]: I1003 18:14:36.961773 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.143408 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.146285 4835 trace.go:236] Trace[1564535883]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 18:14:22.545) (total time: 14600ms): Oct 03 18:14:37 crc kubenswrapper[4835]: Trace[1564535883]: ---"Objects listed" error: 14599ms (18:14:37.145) Oct 03 18:14:37 crc kubenswrapper[4835]: Trace[1564535883]: [14.60023791s] [14.60023791s] END Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.146357 4835 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.146330 4835 trace.go:236] Trace[726143706]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 18:14:22.567) (total time: 14578ms): Oct 03 18:14:37 crc kubenswrapper[4835]: Trace[726143706]: ---"Objects listed" error: 14578ms (18:14:37.146) Oct 03 18:14:37 crc kubenswrapper[4835]: Trace[726143706]: [14.578332749s] [14.578332749s] END Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.147620 4835 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.146942 4835 trace.go:236] Trace[501985992]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 18:14:26.666) (total time: 10480ms): Oct 03 18:14:37 crc kubenswrapper[4835]: Trace[501985992]: ---"Objects listed" error: 10480ms (18:14:37.146) Oct 03 18:14:37 crc kubenswrapper[4835]: Trace[501985992]: [10.48027455s] [10.48027455s] END Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.147914 4835 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.152883 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.153323 4835 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.155049 4835 trace.go:236] Trace[895982645]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 18:14:22.252) (total time: 14902ms): Oct 03 18:14:37 crc kubenswrapper[4835]: Trace[895982645]: ---"Objects listed" error: 14901ms (18:14:37.154) Oct 03 18:14:37 crc kubenswrapper[4835]: Trace[895982645]: [14.902101125s] [14.902101125s] END Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.155097 4835 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.657294 4835 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56990->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.657358 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56990->192.168.126.11:17697: read: connection reset by peer" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.812940 4835 apiserver.go:52] "Watching apiserver" Oct 03 
18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.816099 4835 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.816447 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.816787 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.816876 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.816895 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.817159 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.817442 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.817882 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.817984 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.818037 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.818112 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.818121 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.818545 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.818564 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.818555 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.819221 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.819471 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.819523 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.819927 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.821087 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.844412 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.855198 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.862381 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.869392 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.878618 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.887935 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.896402 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.917006 4835 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.956998 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957035 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957059 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957101 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957124 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957145 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957165 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 
18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957188 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957209 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957229 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957251 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957274 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957298 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957322 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957345 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957366 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957391 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957415 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957437 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957638 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957666 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957689 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957709 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957729 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957750 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957770 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957788 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957807 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957827 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957847 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957867 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957889 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957910 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.957931 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958046 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958051 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). 
InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958101 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958137 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958172 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958151 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958155 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958290 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958301 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958306 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958391 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958414 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958465 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958492 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958503 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958514 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958580 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958598 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958628 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958648 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958650 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958666 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958687 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958706 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958731 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958722 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958773 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958790 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958878 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958907 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958922 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958939 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958956 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958973 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958991 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959006 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959021 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959036 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959051 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959110 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959129 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959154 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959169 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959195 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959210 4835 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959225 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959243 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959266 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959288 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959308 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959324 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959340 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959357 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959382 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 
03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959400 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959419 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959468 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959484 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959499 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959517 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959532 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959550 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959568 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959584 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959599 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960435 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960472 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960488 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960503 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960519 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960536 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960555 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960573 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960592 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod 
\"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960607 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960624 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960640 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960657 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960673 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958786 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958797 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958806 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958947 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.958954 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959014 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959130 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959249 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959536 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959660 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959687 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.959726 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960018 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960057 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960184 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960204 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960218 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960308 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960454 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960501 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960549 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960561 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960621 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.960694 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.960935 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:14:38.460684794 +0000 UTC m=+20.176625776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961260 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961285 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961310 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961318 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961355 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961375 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961399 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961418 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961437 4835 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961460 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961486 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961504 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961521 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961545 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961570 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961934 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961951 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961967 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961982 4835 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961997 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.962094 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.962120 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.962147 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.963144 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.963181 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.963205 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.963228 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.963250 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.963271 4835 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.963566 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.963989 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964018 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964035 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964053 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964091 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964114 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964138 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964159 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 18:14:37 crc 
kubenswrapper[4835]: I1003 18:14:37.964183 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964207 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964275 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964299 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964353 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964379 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964404 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964432 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964455 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964478 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964501 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964523 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964547 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964570 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964590 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964605 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964620 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964635 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964651 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964666 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964681 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964698 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964722 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964746 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964767 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964787 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964808 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964885 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964909 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964926 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964944 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964966 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964991 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965013 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965029 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965052 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965095 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965121 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965144 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965172 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965200 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965225 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965249 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965275 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965298 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965320 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965344 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965368 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965391 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965411 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965433 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965454 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965476 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965501 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965526 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965553 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965578 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965601 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965623 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965668 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965697 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965725 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965754 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965780 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965801 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965827 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965906 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965932 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965953 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965976 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966006 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966034 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966058 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966138 4835 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966155 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966168 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966182 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966193 4835 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: 
I1003 18:14:37.966202 4835 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966212 4835 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966227 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966240 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966254 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966266 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966279 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966291 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966304 4835 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966318 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966332 4835 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966345 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966359 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" 
DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966372 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966385 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966398 4835 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966410 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966422 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966471 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966486 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966500 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966513 4835 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966526 4835 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966539 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966554 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966567 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") 
on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966580 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966593 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966606 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966619 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966632 4835 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966651 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966664 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966676 4835 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966944 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.967894 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961455 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961484 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961168 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961688 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.961755 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.962188 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.962201 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.962463 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.962501 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.962599 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.962636 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.962650 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.962652 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.962689 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.962706 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.962763 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.963300 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.970783 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.963332 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.963349 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.963582 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.963660 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.963913 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964202 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964314 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964472 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964528 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964785 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964957 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.964959 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965009 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965035 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965263 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965345 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965453 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965520 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965525 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965640 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965696 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.971040 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965801 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965813 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.965862 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966133 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966244 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966289 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966297 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966592 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.971128 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966636 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966929 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.966952 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.967027 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.967051 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.967320 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.967330 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.967414 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.967481 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.967626 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.967952 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.968036 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.968386 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.968453 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.968601 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.968634 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.968812 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.968820 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.968870 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.968936 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.969419 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.969792 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.969797 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.970020 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.970147 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.970266 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.970271 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.970523 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.971136 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.971202 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.971461 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.971642 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.971932 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.972040 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.972116 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.972469 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.972647 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.972668 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.972676 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.972689 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.972740 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.972878 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.972904 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.973220 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.973228 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.973250 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.973415 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.973508 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.973549 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.973709 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.974052 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.974368 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.974555 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.974470 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.974371 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.974515 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.974060 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.974640 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.974652 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.974824 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.974908 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.975090 4835 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3" exitCode=255 Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.975135 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3"} Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.975136 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.975149 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.975210 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.977191 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.977321 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.977356 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.977431 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.977522 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.975717 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.975846 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.975978 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.975992 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.976022 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.976063 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.976164 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.976180 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.976499 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.976536 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.976601 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.976686 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.976758 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.976773 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.977955 4835 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.977997 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.977980 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:38.477955764 +0000 UTC m=+20.193896696 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.978017 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.978117 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:38.478106098 +0000 UTC m=+20.194046970 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.978127 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.978319 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.978614 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.979165 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.979730 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.980148 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.980333 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.981260 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.981352 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.981753 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.981851 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.982329 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.982485 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.984860 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.986408 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.986510 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.990276 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.991568 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.991668 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.991751 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.991998 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.992038 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.992049 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.992109 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:38.492094012 +0000 UTC m=+20.208034884 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.992454 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.992532 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.992721 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.995235 4835 scope.go:117] "RemoveContainer" containerID="a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.995292 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.996156 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.997515 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.997985 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.998848 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.998881 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.998897 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:37 crc kubenswrapper[4835]: E1003 18:14:37.998950 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:38.498933251 +0000 UTC m=+20.214874123 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:37 crc kubenswrapper[4835]: I1003 18:14:37.999377 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.001268 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.003714 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.015750 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.018267 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.019688 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.025157 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.031278 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.041913 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.052933 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067432 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067487 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067556 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067571 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067582 4835 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067593 4835 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067604 4835 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067616 4835 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067627 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067592 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067637 4835 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067752 4835 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067773 4835 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067783 4835 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067793 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067805 4835 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067816 4835 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067825 4835 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067833 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067843 4835 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067852 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067860 4835 
reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067567 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067872 4835 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067909 4835 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067921 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067930 4835 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067939 4835 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067948 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067956 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067964 4835 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067975 4835 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067984 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.067992 4835 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068002 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068010 4835 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068019 4835 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068027 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068035 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068042 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068050 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068058 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068081 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068092 4835 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068101 4835 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068109 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068117 4835 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath 
\"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068125 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068133 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068141 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068148 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068156 4835 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068165 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068174 4835 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068182 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068189 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068197 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068205 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068212 4835 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068220 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc 
kubenswrapper[4835]: I1003 18:14:38.068228 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068236 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068244 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068252 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068259 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068267 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068275 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068283 4835 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068290 4835 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068298 4835 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068305 4835 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068313 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068320 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068328 4835 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068336 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068343 4835 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068351 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068358 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068366 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068375 4835 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068384 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068391 4835 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068399 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068408 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068417 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068425 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068433 4835 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068441 4835 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068449 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068459 4835 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068466 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068474 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068482 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068490 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068498 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068507 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068515 4835 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068523 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068530 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068538 4835 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068546 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068554 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068562 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068570 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068580 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068587 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068595 4835 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068603 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068611 4835 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068619 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068627 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068635 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" 
Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068643 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068651 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068660 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068669 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068676 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068684 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068692 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068700 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068708 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068716 4835 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068725 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068734 4835 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068742 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068749 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068757 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068765 4835 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068772 4835 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068780 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068788 4835 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068798 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068808 4835 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068816 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068825 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068833 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068841 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068848 4835 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068857 4835 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068865 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068883 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068891 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068899 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068907 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068915 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068923 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068931 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068939 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068947 4835 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068955 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068962 4835 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068971 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068979 4835 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068987 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.068995 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.069002 4835 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.069011 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.069019 4835 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.069028 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.129947 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.137622 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 18:14:38 crc kubenswrapper[4835]: W1003 18:14:38.140564 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-ca6eb7b157ed7c6cf2c02a4c633e38b8870468fe8a3c163eb18457ba329ddbba WatchSource:0}: Error finding container ca6eb7b157ed7c6cf2c02a4c633e38b8870468fe8a3c163eb18457ba329ddbba: Status 404 returned error can't find the container with id ca6eb7b157ed7c6cf2c02a4c633e38b8870468fe8a3c163eb18457ba329ddbba Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.144241 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 18:14:38 crc kubenswrapper[4835]: W1003 18:14:38.161777 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-92d8a863b54c369a1b00a970052ace56b9c5c2bac306371136fd187953fc2ff4 WatchSource:0}: Error finding container 92d8a863b54c369a1b00a970052ace56b9c5c2bac306371136fd187953fc2ff4: Status 404 returned error can't find the container with id 92d8a863b54c369a1b00a970052ace56b9c5c2bac306371136fd187953fc2ff4 Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.162153 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:38 crc kubenswrapper[4835]: W1003 18:14:38.162938 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-4709cbbaa28a24f8e0c44e95d6e5589efeaf44880b59fb860e845c17dcafe598 WatchSource:0}: Error finding container 4709cbbaa28a24f8e0c44e95d6e5589efeaf44880b59fb860e845c17dcafe598: Status 404 returned error can't find the container with id 4709cbbaa28a24f8e0c44e95d6e5589efeaf44880b59fb860e845c17dcafe598 Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.165806 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.174230 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.176916 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.186849 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.199627 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.210364 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.224735 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03
T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.239712 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.252231 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.271724 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.283633 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.293126 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.303141 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.316357 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.327320 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.340549 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03
T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.352178 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.471269 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:14:38 crc kubenswrapper[4835]: E1003 18:14:38.471417 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:14:39.47139922 +0000 UTC m=+21.187340092 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.571598 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.571637 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.571659 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.571680 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:38 crc kubenswrapper[4835]: E1003 18:14:38.571781 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 18:14:38 crc kubenswrapper[4835]: E1003 18:14:38.571809 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 18:14:38 crc kubenswrapper[4835]: E1003 18:14:38.571850 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 18:14:38 crc kubenswrapper[4835]: E1003 18:14:38.571907 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 18:14:38 crc kubenswrapper[4835]: E1003 18:14:38.571936 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 
18:14:38 crc kubenswrapper[4835]: E1003 18:14:38.571944 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 18:14:38 crc kubenswrapper[4835]: E1003 18:14:38.571956 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 18:14:38 crc kubenswrapper[4835]: E1003 18:14:38.571968 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:38 crc kubenswrapper[4835]: E1003 18:14:38.571827 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:39.571813558 +0000 UTC m=+21.287754430 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 18:14:38 crc kubenswrapper[4835]: E1003 18:14:38.572019 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:39.571998043 +0000 UTC m=+21.287939005 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 18:14:38 crc kubenswrapper[4835]: E1003 18:14:38.572044 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:39.572035214 +0000 UTC m=+21.287976206 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:38 crc kubenswrapper[4835]: E1003 18:14:38.572110 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:39.572103066 +0000 UTC m=+21.288043938 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.876687 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:38 crc kubenswrapper[4835]: E1003 18:14:38.876803 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.880862 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.881453 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.882695 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.883355 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.884356 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.884862 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.885480 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.886403 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.886962 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.888149 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.888761 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.890274 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.890862 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.891488 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.892624 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.893236 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.894345 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.894721 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.895353 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.896962 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.897398 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.898578 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.899061 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.900186 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.900572 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.901194 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.902449 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.903101 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.904231 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.904797 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.905823 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03
T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:38Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.906006 4835 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.906196 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.907800 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.908785 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 
03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.909266 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.910776 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.911386 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.912271 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.912872 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.913842 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.914488 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.915396 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.915949 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.916904 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.917346 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.918207 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.918664 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.919714 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 
18:14:38.920205 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.920959 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.921396 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.922327 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.922871 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.923346 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.940089 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.940496 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:38Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.956217 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:38Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.976395 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"n
ame\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:38Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.980160 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.982944 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125"} Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.983111 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.983855 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4709cbbaa28a24f8e0c44e95d6e5589efeaf44880b59fb860e845c17dcafe598"} Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.985543 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629"} Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.985569 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3"} Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.985580 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"92d8a863b54c369a1b00a970052ace56b9c5c2bac306371136fd187953fc2ff4"} Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.986902 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4"} Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.986940 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ca6eb7b157ed7c6cf2c02a4c633e38b8870468fe8a3c163eb18457ba329ddbba"} Oct 03 18:14:38 crc kubenswrapper[4835]: I1003 18:14:38.992948 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:38Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:38 crc kubenswrapper[4835]: E1003 18:14:38.994978 4835 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.004244 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-4x78q"] Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.004506 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4x78q" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.004854 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-dzgvb"] Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.005314 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-w4fql"] Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.005790 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.006248 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.011368 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.012036 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.012199 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.012268 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.012198 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.012355 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.012419 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.012548 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.012612 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.012578 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.012775 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.012893 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.012962 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.013101 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.028010 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.061750 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.075061 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10a8b8e7-c0f5-4c40-b0bd-b52379adae1f-proxy-tls\") pod \"machine-config-daemon-w4fql\" (UID: \"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\") " pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.075407 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgs5l\" (UniqueName: \"kubernetes.io/projected/10a8b8e7-c0f5-4c40-b0bd-b52379adae1f-kube-api-access-rgs5l\") pod \"machine-config-daemon-w4fql\" (UID: \"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\") " pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.075545 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/84c53bc2-068f-4b54-9a51-1eee44a03e59-cnibin\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.075648 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10a8b8e7-c0f5-4c40-b0bd-b52379adae1f-mcd-auth-proxy-config\") pod \"machine-config-daemon-w4fql\" (UID: \"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\") " pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.075748 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/84c53bc2-068f-4b54-9a51-1eee44a03e59-os-release\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.075840 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84c53bc2-068f-4b54-9a51-1eee44a03e59-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.075938 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjhfs\" (UniqueName: \"kubernetes.io/projected/84c53bc2-068f-4b54-9a51-1eee44a03e59-kube-api-access-fjhfs\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.076131 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2af8bc4b-5145-400f-a847-ef393bd84601-hosts-file\") pod \"node-resolver-4x78q\" (UID: \"2af8bc4b-5145-400f-a847-ef393bd84601\") " pod="openshift-dns/node-resolver-4x78q" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.076173 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84c53bc2-068f-4b54-9a51-1eee44a03e59-system-cni-dir\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.076194 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/84c53bc2-068f-4b54-9a51-1eee44a03e59-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.076229 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht675\" (UniqueName: \"kubernetes.io/projected/2af8bc4b-5145-400f-a847-ef393bd84601-kube-api-access-ht675\") pod \"node-resolver-4x78q\" (UID: \"2af8bc4b-5145-400f-a847-ef393bd84601\") " pod="openshift-dns/node-resolver-4x78q" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.076249 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/84c53bc2-068f-4b54-9a51-1eee44a03e59-cni-binary-copy\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.076267 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/10a8b8e7-c0f5-4c40-b0bd-b52379adae1f-rootfs\") pod \"machine-config-daemon-w4fql\" (UID: \"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\") " pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.088692 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.126026 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.141161 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.161967 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.177592 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10a8b8e7-c0f5-4c40-b0bd-b52379adae1f-proxy-tls\") pod \"machine-config-daemon-w4fql\" (UID: \"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\") " pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.177641 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgs5l\" (UniqueName: \"kubernetes.io/projected/10a8b8e7-c0f5-4c40-b0bd-b52379adae1f-kube-api-access-rgs5l\") pod \"machine-config-daemon-w4fql\" (UID: \"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\") " pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.177664 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/84c53bc2-068f-4b54-9a51-1eee44a03e59-cnibin\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.177698 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/10a8b8e7-c0f5-4c40-b0bd-b52379adae1f-mcd-auth-proxy-config\") pod \"machine-config-daemon-w4fql\" (UID: \"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\") " pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.177717 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/84c53bc2-068f-4b54-9a51-1eee44a03e59-os-release\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.177736 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84c53bc2-068f-4b54-9a51-1eee44a03e59-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.177762 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjhfs\" (UniqueName: \"kubernetes.io/projected/84c53bc2-068f-4b54-9a51-1eee44a03e59-kube-api-access-fjhfs\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.177835 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2af8bc4b-5145-400f-a847-ef393bd84601-hosts-file\") pod \"node-resolver-4x78q\" (UID: \"2af8bc4b-5145-400f-a847-ef393bd84601\") " pod="openshift-dns/node-resolver-4x78q" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.177870 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84c53bc2-068f-4b54-9a51-1eee44a03e59-system-cni-dir\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.177891 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/84c53bc2-068f-4b54-9a51-1eee44a03e59-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.177924 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht675\" (UniqueName: \"kubernetes.io/projected/2af8bc4b-5145-400f-a847-ef393bd84601-kube-api-access-ht675\") pod \"node-resolver-4x78q\" (UID: \"2af8bc4b-5145-400f-a847-ef393bd84601\") " pod="openshift-dns/node-resolver-4x78q" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.177950 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/84c53bc2-068f-4b54-9a51-1eee44a03e59-cni-binary-copy\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 
18:14:39.177955 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2af8bc4b-5145-400f-a847-ef393bd84601-hosts-file\") pod \"node-resolver-4x78q\" (UID: \"2af8bc4b-5145-400f-a847-ef393bd84601\") " pod="openshift-dns/node-resolver-4x78q" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.177979 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/10a8b8e7-c0f5-4c40-b0bd-b52379adae1f-rootfs\") pod \"machine-config-daemon-w4fql\" (UID: \"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\") " pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.178041 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/10a8b8e7-c0f5-4c40-b0bd-b52379adae1f-rootfs\") pod \"machine-config-daemon-w4fql\" (UID: \"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\") " pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.178119 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/84c53bc2-068f-4b54-9a51-1eee44a03e59-system-cni-dir\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.177888 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/84c53bc2-068f-4b54-9a51-1eee44a03e59-os-release\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.177920 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/84c53bc2-068f-4b54-9a51-1eee44a03e59-cnibin\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.178436 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84c53bc2-068f-4b54-9a51-1eee44a03e59-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.178625 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10a8b8e7-c0f5-4c40-b0bd-b52379adae1f-mcd-auth-proxy-config\") pod \"machine-config-daemon-w4fql\" (UID: \"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\") " pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.178934 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/84c53bc2-068f-4b54-9a51-1eee44a03e59-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: 
I1003 18:14:39.179805 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/84c53bc2-068f-4b54-9a51-1eee44a03e59-cni-binary-copy\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.180983 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.189727 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10a8b8e7-c0f5-4c40-b0bd-b52379adae1f-proxy-tls\") pod \"machine-config-daemon-w4fql\" (UID: \"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\") " pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.196092 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht675\" (UniqueName: \"kubernetes.io/projected/2af8bc4b-5145-400f-a847-ef393bd84601-kube-api-access-ht675\") pod \"node-resolver-4x78q\" (UID: \"2af8bc4b-5145-400f-a847-ef393bd84601\") " pod="openshift-dns/node-resolver-4x78q" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.196116 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgs5l\" (UniqueName: \"kubernetes.io/projected/10a8b8e7-c0f5-4c40-b0bd-b52379adae1f-kube-api-access-rgs5l\") pod \"machine-config-daemon-w4fql\" (UID: \"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\") " pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.199252 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjhfs\" (UniqueName: \"kubernetes.io/projected/84c53bc2-068f-4b54-9a51-1eee44a03e59-kube-api-access-fjhfs\") pod \"multus-additional-cni-plugins-dzgvb\" (UID: \"84c53bc2-068f-4b54-9a51-1eee44a03e59\") " pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 
18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.200135 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.216098 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.231173 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.245566 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.260669 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.276090 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.319550 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-4x78q" Oct 03 18:14:39 crc kubenswrapper[4835]: W1003 18:14:39.329730 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2af8bc4b_5145_400f_a847_ef393bd84601.slice/crio-6fd435af7b90ca36aab152f844870b425fbe48ae1c595861d05ed95876ea10bc WatchSource:0}: Error finding container 6fd435af7b90ca36aab152f844870b425fbe48ae1c595861d05ed95876ea10bc: Status 404 returned error can't find the container with id 6fd435af7b90ca36aab152f844870b425fbe48ae1c595861d05ed95876ea10bc Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.341367 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.345666 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" Oct 03 18:14:39 crc kubenswrapper[4835]: W1003 18:14:39.358999 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10a8b8e7_c0f5_4c40_b0bd_b52379adae1f.slice/crio-c60a4fa4d4cc897a8a18273d1986131d62ee7f882d5b8d583202e0ff493a8c91 WatchSource:0}: Error finding container c60a4fa4d4cc897a8a18273d1986131d62ee7f882d5b8d583202e0ff493a8c91: Status 404 returned error can't find the container with id c60a4fa4d4cc897a8a18273d1986131d62ee7f882d5b8d583202e0ff493a8c91 Oct 03 18:14:39 crc kubenswrapper[4835]: W1003 18:14:39.362024 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84c53bc2_068f_4b54_9a51_1eee44a03e59.slice/crio-8545c7adfa73902ff1d116fe82f4ce51f241a0e9514669615d6a13b7bad2148a WatchSource:0}: Error finding container 8545c7adfa73902ff1d116fe82f4ce51f241a0e9514669615d6a13b7bad2148a: Status 404 returned error can't find the container with id 8545c7adfa73902ff1d116fe82f4ce51f241a0e9514669615d6a13b7bad2148a Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.403249 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8p9cd"] Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.403647 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.408437 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.409803 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.415769 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p2w8j"] Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.416637 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.429117 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.429242 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.429436 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.429522 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.429720 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.432087 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.436782 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.450785 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7
73257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480448 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480576 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-systemd-units\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: E1003 18:14:39.480636 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:14:41.480607755 +0000 UTC m=+23.196548627 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480711 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-var-lib-openvswitch\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480736 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-cnibin\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480752 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-hostroot\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480769 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-systemd\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480783 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-multus-cni-dir\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480798 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-var-lib-cni-multus\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480814 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-slash\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480827 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-system-cni-dir\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " 
pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480840 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-os-release\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480854 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-cni-bin\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480866 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-multus-conf-dir\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480879 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-run-netns\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480893 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-run-ovn-kubernetes\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480909 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480928 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-var-lib-cni-bin\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480946 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpncv\" (UniqueName: \"kubernetes.io/projected/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-kube-api-access-fpncv\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480959 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-ovn\") pod \"ovnkube-node-p2w8j\" 
(UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480974 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-cni-netd\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.480987 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovn-node-metrics-cert\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481004 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-multus-socket-dir-parent\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481017 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-multus-daemon-config\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481033 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-run-multus-certs\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481059 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-etc-kubernetes\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481088 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-kubelet\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481103 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-var-lib-kubelet\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481119 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-run-k8s-cni-cncf-io\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481139 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-run-netns\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481156 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9z72\" (UniqueName: \"kubernetes.io/projected/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-kube-api-access-m9z72\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481170 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-cni-binary-copy\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481191 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovnkube-config\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481206 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovnkube-script-lib\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481225 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-log-socket\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481239 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-env-overrides\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481252 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-node-log\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481279 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-etc-openvswitch\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.481292 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-openvswitch\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.489332 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.510151 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.524640 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.538447 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.556881 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.570819 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.581967 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-ovn\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582004 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-cni-netd\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582022 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovn-node-metrics-cert\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582039 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-run-multus-certs\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582062 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-multus-socket-dir-parent\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582094 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-multus-daemon-config\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582111 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582128 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-kubelet\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582120 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-ovn\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582147 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-var-lib-kubelet\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582162 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-etc-kubernetes\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582154 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-cni-netd\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582180 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9z72\" (UniqueName: \"kubernetes.io/projected/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-kube-api-access-m9z72\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582237 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-cni-binary-copy\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582255 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-run-k8s-cni-cncf-io\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582270 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-run-netns\") pod \"multus-8p9cd\" (UID: 
\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582289 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovnkube-config\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582303 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovnkube-script-lib\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582321 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-log-socket\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582336 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-env-overrides\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582356 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-etc-openvswitch\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582370 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-openvswitch\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582383 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-node-log\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582405 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582428 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-systemd-units\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582435 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-multus-socket-dir-parent\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582449 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582466 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-var-lib-openvswitch\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582480 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-cnibin\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582495 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-hostroot\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582512 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582528 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-multus-cni-dir\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582544 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-systemd\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582561 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-var-lib-cni-multus\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 
18:14:39.582574 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-slash\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582589 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-system-cni-dir\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582601 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-os-release\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582616 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-multus-conf-dir\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582633 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-cni-bin\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582649 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582663 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-var-lib-cni-bin\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582679 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-run-netns\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582694 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-run-ovn-kubernetes\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582710 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fpncv\" (UniqueName: \"kubernetes.io/projected/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-kube-api-access-fpncv\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582719 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-run-multus-certs\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582722 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-kubelet\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582804 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-var-lib-kubelet\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: E1003 18:14:39.582822 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582835 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-etc-kubernetes\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: E1003 18:14:39.582876 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:41.582859471 +0000 UTC m=+23.298800413 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582879 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-cnibin\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.582971 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-hostroot\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583007 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-multus-daemon-config\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: E1003 18:14:39.583027 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 18:14:39 crc kubenswrapper[4835]: E1003 18:14:39.583040 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583046 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-etc-openvswitch\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: E1003 18:14:39.583049 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:39 crc kubenswrapper[4835]: E1003 18:14:39.583125 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:41.583110857 +0000 UTC m=+23.299051729 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583154 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-run-k8s-cni-cncf-io\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583157 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-multus-conf-dir\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583176 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-multus-cni-dir\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583176 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-run-netns\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583195 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-systemd\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583211 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-cni-bin\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583222 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-var-lib-cni-multus\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583231 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583248 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-slash\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583250 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-host-var-lib-cni-bin\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583264 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-run-netns\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583285 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-run-ovn-kubernetes\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583306 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-log-socket\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583368 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-system-cni-dir\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583395 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-cni-binary-copy\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583434 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-openvswitch\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583465 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-node-log\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583494 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-os-release\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc 
kubenswrapper[4835]: E1003 18:14:39.583539 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 18:14:39 crc kubenswrapper[4835]: E1003 18:14:39.583582 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:41.58356741 +0000 UTC m=+23.299508282 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583610 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-systemd-units\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: E1003 18:14:39.583668 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 18:14:39 crc kubenswrapper[4835]: E1003 18:14:39.583679 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583680 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovnkube-config\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: E1003 18:14:39.583689 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:39 crc kubenswrapper[4835]: E1003 18:14:39.583715 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:41.583707394 +0000 UTC m=+23.299648256 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.583718 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-var-lib-openvswitch\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.584044 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-env-overrides\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.584155 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovnkube-script-lib\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.587374 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovn-node-metrics-cert\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.588822 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.598598 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpncv\" (UniqueName: \"kubernetes.io/projected/fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93-kube-api-access-fpncv\") pod \"multus-8p9cd\" (UID: \"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\") " pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.602112 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9z72\" (UniqueName: \"kubernetes.io/projected/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-kube-api-access-m9z72\") pod \"ovnkube-node-p2w8j\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.604077 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.618018 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.635674 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.649495 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.668831 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.684601 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.700848 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.712003 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.725575 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.728524 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8p9cd" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.737789 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: W1003 18:14:39.739388 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd3bdc71_e8c7_4cfa_9230_5bb1c413ae93.slice/crio-6f327bb39a80d929db2e56228d5562687fe35bb8bbee89452885d83c124898d7 WatchSource:0}: Error finding container 6f327bb39a80d929db2e56228d5562687fe35bb8bbee89452885d83c124898d7: Status 404 returned error can't find the container with id 6f327bb39a80d929db2e56228d5562687fe35bb8bbee89452885d83c124898d7 Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.749368 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.760252 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.768746 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: W1003 18:14:39.774374 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48bbeb2a_b75a_4650_b5ea_b180b8c0168a.slice/crio-daaf33b06dcdd3e280762ba7c200e6bd76ea98f11b244c4a36077335274dd0f4 WatchSource:0}: Error finding container daaf33b06dcdd3e280762ba7c200e6bd76ea98f11b244c4a36077335274dd0f4: Status 404 returned error can't find the container with id daaf33b06dcdd3e280762ba7c200e6bd76ea98f11b244c4a36077335274dd0f4 Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.790937 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.805550 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.814346 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.831423 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.853257 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.876440 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.876521 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:39 crc kubenswrapper[4835]: E1003 18:14:39.876602 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:14:39 crc kubenswrapper[4835]: E1003 18:14:39.876651 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.991618 4835 generic.go:334] "Generic (PLEG): container finished" podID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerID="6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc" exitCode=0 Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.991692 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerDied","Data":"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc"} Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.991718 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerStarted","Data":"daaf33b06dcdd3e280762ba7c200e6bd76ea98f11b244c4a36077335274dd0f4"} Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.993599 4835 generic.go:334] "Generic (PLEG): container finished" podID="84c53bc2-068f-4b54-9a51-1eee44a03e59" containerID="61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284" exitCode=0 Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.993665 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" event={"ID":"84c53bc2-068f-4b54-9a51-1eee44a03e59","Type":"ContainerDied","Data":"61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284"} Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.993691 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" event={"ID":"84c53bc2-068f-4b54-9a51-1eee44a03e59","Type":"ContainerStarted","Data":"8545c7adfa73902ff1d116fe82f4ce51f241a0e9514669615d6a13b7bad2148a"} Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.995213 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f"} Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.995246 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf"} Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.995257 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"c60a4fa4d4cc897a8a18273d1986131d62ee7f882d5b8d583202e0ff493a8c91"} Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.996434 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8p9cd" event={"ID":"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93","Type":"ContainerStarted","Data":"d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33"} Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.996474 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8p9cd" event={"ID":"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93","Type":"ContainerStarted","Data":"6f327bb39a80d929db2e56228d5562687fe35bb8bbee89452885d83c124898d7"} Oct 03 18:14:39 
crc kubenswrapper[4835]: I1003 18:14:39.998324 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4x78q" event={"ID":"2af8bc4b-5145-400f-a847-ef393bd84601","Type":"ContainerStarted","Data":"06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4"} Oct 03 18:14:39 crc kubenswrapper[4835]: I1003 18:14:39.998360 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4x78q" event={"ID":"2af8bc4b-5145-400f-a847-ef393bd84601","Type":"ContainerStarted","Data":"6fd435af7b90ca36aab152f844870b425fbe48ae1c595861d05ed95876ea10bc"} Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.007102 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.025984 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.040436 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.064102 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.075631 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.087956 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.099733 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.112264 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.125407 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.138540 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.177183 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.222892 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.275731 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.301986 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.331234 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.361108 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc 
kubenswrapper[4835]: I1003 18:14:40.372969 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.394303 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-re
sources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.411266 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.424219 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.447370 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z 
is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.481784 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.526648 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.562574 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.604403 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.643101 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.702186 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.715707 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.718301 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.719378 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.742269 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.783862 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc 
kubenswrapper[4835]: I1003 18:14:40.821994 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.862425 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-re
sources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.876290 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:40 crc kubenswrapper[4835]: E1003 18:14:40.876398 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.903761 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.942459 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:40 crc kubenswrapper[4835]: I1003 18:14:40.987099 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:40Z 
is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.002735 4835 generic.go:334] "Generic (PLEG): container finished" podID="84c53bc2-068f-4b54-9a51-1eee44a03e59" containerID="7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73" exitCode=0 Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.002806 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" event={"ID":"84c53bc2-068f-4b54-9a51-1eee44a03e59","Type":"ContainerDied","Data":"7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73"} Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.004185 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b"} Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.007841 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerStarted","Data":"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db"} Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.007881 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerStarted","Data":"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea"} Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.007892 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerStarted","Data":"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c"} Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.007902 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerStarted","Data":"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b"} Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.022591 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.066691 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.103894 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.143319 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.185251 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.225991 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.267261 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.309784 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a
518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.341241 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.383643 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.423508 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.462379 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.501402 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.503698 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:14:41 crc kubenswrapper[4835]: E1003 18:14:41.503860 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:14:45.50383189 +0000 UTC m=+27.219772752 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.541746 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.582399 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.605235 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.605294 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.605327 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.605349 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:41 crc kubenswrapper[4835]: E1003 18:14:41.605371 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 18:14:41 crc kubenswrapper[4835]: E1003 18:14:41.605431 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:45.605415979 +0000 UTC m=+27.321356851 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 18:14:41 crc kubenswrapper[4835]: E1003 18:14:41.605461 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 18:14:41 crc kubenswrapper[4835]: E1003 18:14:41.605471 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 18:14:41 crc kubenswrapper[4835]: E1003 18:14:41.605491 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 18:14:41 crc kubenswrapper[4835]: E1003 18:14:41.605504 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:41 crc kubenswrapper[4835]: E1003 18:14:41.605528 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:45.605511702 +0000 UTC m=+27.321452644 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 18:14:41 crc kubenswrapper[4835]: E1003 18:14:41.605548 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:45.605537103 +0000 UTC m=+27.321478065 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:41 crc kubenswrapper[4835]: E1003 18:14:41.605616 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 18:14:41 crc kubenswrapper[4835]: E1003 18:14:41.605665 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 18:14:41 crc kubenswrapper[4835]: E1003 18:14:41.605686 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:41 crc kubenswrapper[4835]: E1003 18:14:41.605772 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:45.605748788 +0000 UTC m=+27.321689690 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.623553 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-
03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.662743 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\"
:\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.702235 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/op
enshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.744674 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:41Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.876189 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:41 crc kubenswrapper[4835]: I1003 18:14:41.876234 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:41 crc kubenswrapper[4835]: E1003 18:14:41.876310 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:14:41 crc kubenswrapper[4835]: E1003 18:14:41.876450 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.012422 4835 generic.go:334] "Generic (PLEG): container finished" podID="84c53bc2-068f-4b54-9a51-1eee44a03e59" containerID="6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85" exitCode=0 Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.012509 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" event={"ID":"84c53bc2-068f-4b54-9a51-1eee44a03e59","Type":"ContainerDied","Data":"6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85"} Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.016743 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerStarted","Data":"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8"} Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.016796 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerStarted","Data":"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9"} Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.036778 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z 
is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.061452 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.073000 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.085991 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.100443 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.110544 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.121737 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.132495 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.142093 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.161768 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mou
ntPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.210381 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.225886 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb
03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.264894 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.305200 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.370244 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-zsch7"] Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.370827 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zsch7" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.372370 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.372465 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.374005 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.395175 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.422676 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.463332 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.502239 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.513635 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e63fda18-1d89-4268-aa9a-9e04f6c1539e-serviceca\") pod \"node-ca-zsch7\" (UID: \"e63fda18-1d89-4268-aa9a-9e04f6c1539e\") " pod="openshift-image-registry/node-ca-zsch7" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.513662 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2844\" (UniqueName: \"kubernetes.io/projected/e63fda18-1d89-4268-aa9a-9e04f6c1539e-kube-api-access-t2844\") pod \"node-ca-zsch7\" (UID: \"e63fda18-1d89-4268-aa9a-9e04f6c1539e\") " pod="openshift-image-registry/node-ca-zsch7" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.513698 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e63fda18-1d89-4268-aa9a-9e04f6c1539e-host\") pod \"node-ca-zsch7\" (UID: \"e63fda18-1d89-4268-aa9a-9e04f6c1539e\") " pod="openshift-image-registry/node-ca-zsch7" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.542083 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.581863 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.614689 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e63fda18-1d89-4268-aa9a-9e04f6c1539e-serviceca\") pod \"node-ca-zsch7\" (UID: \"e63fda18-1d89-4268-aa9a-9e04f6c1539e\") " pod="openshift-image-registry/node-ca-zsch7" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.614723 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2844\" (UniqueName: \"kubernetes.io/projected/e63fda18-1d89-4268-aa9a-9e04f6c1539e-kube-api-access-t2844\") pod \"node-ca-zsch7\" (UID: \"e63fda18-1d89-4268-aa9a-9e04f6c1539e\") " pod="openshift-image-registry/node-ca-zsch7" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.614755 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e63fda18-1d89-4268-aa9a-9e04f6c1539e-host\") pod \"node-ca-zsch7\" (UID: \"e63fda18-1d89-4268-aa9a-9e04f6c1539e\") " pod="openshift-image-registry/node-ca-zsch7" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.614829 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e63fda18-1d89-4268-aa9a-9e04f6c1539e-host\") pod \"node-ca-zsch7\" (UID: \"e63fda18-1d89-4268-aa9a-9e04f6c1539e\") " pod="openshift-image-registry/node-ca-zsch7" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.615626 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e63fda18-1d89-4268-aa9a-9e04f6c1539e-serviceca\") pod \"node-ca-zsch7\" (UID: \"e63fda18-1d89-4268-aa9a-9e04f6c1539e\") " pod="openshift-image-registry/node-ca-zsch7" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.622712 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.650881 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2844\" (UniqueName: \"kubernetes.io/projected/e63fda18-1d89-4268-aa9a-9e04f6c1539e-kube-api-access-t2844\") pod \"node-ca-zsch7\" (UID: \"e63fda18-1d89-4268-aa9a-9e04f6c1539e\") " pod="openshift-image-registry/node-ca-zsch7" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.685776 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zsch7" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.687712 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: W1003 18:14:42.697686 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode63fda18_1d89_4268_aa9a_9e04f6c1539e.slice/crio-b1905092a0c5ca99949d21119e2b8c854a71ea60fa434e1bb9e74b98d5d9baac WatchSource:0}: Error finding container b1905092a0c5ca99949d21119e2b8c854a71ea60fa434e1bb9e74b98d5d9baac: Status 404 returned error can't find the container with id b1905092a0c5ca99949d21119e2b8c854a71ea60fa434e1bb9e74b98d5d9baac Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.723934 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.762940 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb
03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.802556 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.843051 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.876284 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:42 crc kubenswrapper[4835]: E1003 18:14:42.876402 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.888810 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z 
is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.922776 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:42 crc kubenswrapper[4835]: I1003 18:14:42.967842 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a
518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.001172 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:42Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.020474 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zsch7" event={"ID":"e63fda18-1d89-4268-aa9a-9e04f6c1539e","Type":"ContainerStarted","Data":"dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1"} Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.020534 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zsch7" event={"ID":"e63fda18-1d89-4268-aa9a-9e04f6c1539e","Type":"ContainerStarted","Data":"b1905092a0c5ca99949d21119e2b8c854a71ea60fa434e1bb9e74b98d5d9baac"} Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.024296 4835 generic.go:334] "Generic (PLEG): container finished" podID="84c53bc2-068f-4b54-9a51-1eee44a03e59" containerID="d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9" exitCode=0 Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.024333 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" event={"ID":"84c53bc2-068f-4b54-9a51-1eee44a03e59","Type":"ContainerDied","Data":"d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9"} Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.046856 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.082936 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.123671 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.167691 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",
\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.201761 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.248377 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.282154 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.323031 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.362611 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.401714 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.447753 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a
518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.480957 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.524223 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-confi
g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.553348 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.554927 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.554961 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.554970 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.555061 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.566296 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z 
is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.615832 4835 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.616102 4835 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.617045 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.617085 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.617098 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.617113 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.617121 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:43Z","lastTransitionTime":"2025-10-03T18:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:43 crc kubenswrapper[4835]: E1003 18:14:43.629578 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.632589 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.632615 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.632626 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.632642 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.632653 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:43Z","lastTransitionTime":"2025-10-03T18:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:43 crc kubenswrapper[4835]: E1003 18:14:43.643280 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.643684 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.646354 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.646385 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.646393 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.646407 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.646416 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:43Z","lastTransitionTime":"2025-10-03T18:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:43 crc kubenswrapper[4835]: E1003 18:14:43.657167 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.659962 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.660006 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.660016 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.660033 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.660047 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:43Z","lastTransitionTime":"2025-10-03T18:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:43 crc kubenswrapper[4835]: E1003 18:14:43.672440 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.675979 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.676023 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.676035 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.676054 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.676093 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:43Z","lastTransitionTime":"2025-10-03T18:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.684031 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: E1003 18:14:43.686741 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: E1003 18:14:43.686873 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.688240 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.688280 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.688290 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.688306 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.688315 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:43Z","lastTransitionTime":"2025-10-03T18:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.721053 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.762466 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.790120 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:43 
crc kubenswrapper[4835]: I1003 18:14:43.790155 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.790163 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.790176 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.790184 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:43Z","lastTransitionTime":"2025-10-03T18:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.803378 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.846992 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.876617 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.876651 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:43 crc kubenswrapper[4835]: E1003 18:14:43.876732 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:14:43 crc kubenswrapper[4835]: E1003 18:14:43.876799 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.889026 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.892490 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.892518 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.892528 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.892542 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.892556 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:43Z","lastTransitionTime":"2025-10-03T18:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.922289 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.963587 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:43Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.994827 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.994878 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.994987 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.995031 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:43 crc kubenswrapper[4835]: I1003 18:14:43.995052 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:43Z","lastTransitionTime":"2025-10-03T18:14:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.011109 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z 
is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.030329 4835 generic.go:334] "Generic (PLEG): container finished" podID="84c53bc2-068f-4b54-9a51-1eee44a03e59" containerID="5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333" exitCode=0 Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.030385 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" event={"ID":"84c53bc2-068f-4b54-9a51-1eee44a03e59","Type":"ContainerDied","Data":"5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333"} Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.035107 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerStarted","Data":"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af"} Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.042486 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.091406 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6
e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.097812 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.097843 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.097851 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.097864 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.097873 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:44Z","lastTransitionTime":"2025-10-03T18:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.121129 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.163502 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.200878 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.200961 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.200978 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.200996 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.201034 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:44Z","lastTransitionTime":"2025-10-03T18:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.203410 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.247825 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.294485 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a
518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.303782 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.303823 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.303831 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.303846 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.303856 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:44Z","lastTransitionTime":"2025-10-03T18:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.321110 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.364504 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.406421 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.406480 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.406490 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.406514 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.406527 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:44Z","lastTransitionTime":"2025-10-03T18:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.407167 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z 
is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.441690 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.483872 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.508748 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.508776 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.508784 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.508799 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.508807 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:44Z","lastTransitionTime":"2025-10-03T18:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.525702 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.564255 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.603314 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.611197 4835 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.611240 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.611252 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.611270 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.611282 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:44Z","lastTransitionTime":"2025-10-03T18:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.642211 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.682838 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.713730 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.713778 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.713799 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.713820 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.713833 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:44Z","lastTransitionTime":"2025-10-03T18:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.722621 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.762338 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.803993 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.815938 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.815962 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.815972 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.815985 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.815993 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:44Z","lastTransitionTime":"2025-10-03T18:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.844911 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:44Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.876376 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:44 crc kubenswrapper[4835]: E1003 18:14:44.876524 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.918263 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.918300 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.918309 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.918324 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:44 crc kubenswrapper[4835]: I1003 18:14:44.918333 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:44Z","lastTransitionTime":"2025-10-03T18:14:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.020729 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.020768 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.020777 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.020791 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.020800 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:45Z","lastTransitionTime":"2025-10-03T18:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.041206 4835 generic.go:334] "Generic (PLEG): container finished" podID="84c53bc2-068f-4b54-9a51-1eee44a03e59" containerID="8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863" exitCode=0 Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.041248 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" event={"ID":"84c53bc2-068f-4b54-9a51-1eee44a03e59","Type":"ContainerDied","Data":"8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863"} Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.057260 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.072589 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.087363 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.099329 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.110084 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.123881 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-
o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.124152 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.124178 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.124187 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.124199 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.124207 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:45Z","lastTransitionTime":"2025-10-03T18:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.136005 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.164437 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.203325 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.226931 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.226969 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.226979 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.226995 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.227006 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:45Z","lastTransitionTime":"2025-10-03T18:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.241605 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.286623 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.320998 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.329323 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.329360 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.329370 4835 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.329387 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.329422 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:45Z","lastTransitionTime":"2025-10-03T18:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.367928 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.401439 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.431336 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.431578 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.431651 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.431765 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.431847 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:45Z","lastTransitionTime":"2025-10-03T18:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.443579 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.534825 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.534864 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.534872 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.534885 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.534893 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:45Z","lastTransitionTime":"2025-10-03T18:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.544329 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:14:45 crc kubenswrapper[4835]: E1003 18:14:45.544485 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:14:53.544467559 +0000 UTC m=+35.260408421 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.636636 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.636678 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.636686 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.636701 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.636710 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:45Z","lastTransitionTime":"2025-10-03T18:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.645416 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.645471 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.645498 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.645518 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:45 crc kubenswrapper[4835]: E1003 18:14:45.645536 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 03 18:14:45 crc kubenswrapper[4835]: E1003 18:14:45.645615 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:53.645593716 +0000 UTC m=+35.361534588 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 18:14:45 crc kubenswrapper[4835]: E1003 18:14:45.645634 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 18:14:45 crc kubenswrapper[4835]: E1003 18:14:45.645637 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 18:14:45 crc kubenswrapper[4835]: E1003 18:14:45.645733 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 18:14:45 crc kubenswrapper[4835]: E1003 18:14:45.645764 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 18:14:45 crc kubenswrapper[4835]: E1003 18:14:45.645777 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:45 crc kubenswrapper[4835]: E1003 18:14:45.645743 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:53.645726239 +0000 UTC m=+35.361667111 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 18:14:45 crc kubenswrapper[4835]: E1003 18:14:45.645844 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:53.645827163 +0000 UTC m=+35.361768035 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:45 crc kubenswrapper[4835]: E1003 18:14:45.645650 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 18:14:45 crc kubenswrapper[4835]: E1003 18:14:45.645859 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:45 crc kubenswrapper[4835]: E1003 18:14:45.645890 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:53.645884514 +0000 UTC m=+35.361825386 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.739127 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.739157 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.739165 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.739178 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.739189 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:45Z","lastTransitionTime":"2025-10-03T18:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.841649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.841692 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.841701 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.841717 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.841729 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:45Z","lastTransitionTime":"2025-10-03T18:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.876105 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.876108 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:45 crc kubenswrapper[4835]: E1003 18:14:45.876237 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:14:45 crc kubenswrapper[4835]: E1003 18:14:45.876346 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.943268 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.943310 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.943321 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.943338 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:45 crc kubenswrapper[4835]: I1003 18:14:45.943351 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:45Z","lastTransitionTime":"2025-10-03T18:14:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.045154 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.045195 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.045206 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.045221 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.045232 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:46Z","lastTransitionTime":"2025-10-03T18:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.050823 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerStarted","Data":"2edcb0f841037e045e7e84fea3c91bbae7503d0e9f076e1421f0fae34ef47b83"} Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.051199 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.056981 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" event={"ID":"84c53bc2-068f-4b54-9a51-1eee44a03e59","Type":"ContainerStarted","Data":"07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7"} Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.067210 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.076581 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.085375 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.098551 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.110254 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.121344 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb
03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.134803 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.147614 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.147650 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.147659 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.147672 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.147681 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:46Z","lastTransitionTime":"2025-10-03T18:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.157731 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.179993 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.193172 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.205913 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.227354 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a
518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.236675 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.247980 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-confi
g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.249290 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.249335 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.249345 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.249400 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.249411 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:46Z","lastTransitionTime":"2025-10-03T18:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.265158 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2edcb0f841037e045e7e84fea3c91bbae7503d0e9f076e1421f0fae34ef47b83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.274864 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.292562 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a
518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.302604 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.313596 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-confi
g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.329544 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2edcb0f841037e045e7e84fea3c91bbae7503d0e
9f076e1421f0fae34ef47b83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.338568 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.348651 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.350910 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.350949 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.350960 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.350974 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.350984 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:46Z","lastTransitionTime":"2025-10-03T18:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.359815 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.370642 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.404050 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.443164 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.452537 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.452573 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.452582 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.452598 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.452607 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:46Z","lastTransitionTime":"2025-10-03T18:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.482000 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.522491 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.554680 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.554712 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.554721 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.554734 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.554743 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:46Z","lastTransitionTime":"2025-10-03T18:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.561797 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.604440 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.641563 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.657096 4835 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.657133 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.657144 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.657159 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.657170 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:46Z","lastTransitionTime":"2025-10-03T18:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.759789 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.759838 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.759849 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.759867 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.759877 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:46Z","lastTransitionTime":"2025-10-03T18:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.862483 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.862527 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.862538 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.862554 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.862564 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:46Z","lastTransitionTime":"2025-10-03T18:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.876307 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:46 crc kubenswrapper[4835]: E1003 18:14:46.876415 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.964456 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.964494 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.964502 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.964516 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:46 crc kubenswrapper[4835]: I1003 18:14:46.964525 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:46Z","lastTransitionTime":"2025-10-03T18:14:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.059129 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.059523 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.065988 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.066029 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.066042 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.066061 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.066097 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:47Z","lastTransitionTime":"2025-10-03T18:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.077616 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.092504 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.104398 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.117413 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.130262 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.141125 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.151785 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb
03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.164882 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.168093 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.168129 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.168138 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.168151 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.168159 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:47Z","lastTransitionTime":"2025-10-03T18:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.176141 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.187155 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.200200 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.217309 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a
518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.228392 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.241627 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-confi
g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.258062 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2edcb0f841037e045e7e84fea3c91bbae7503d0e
9f076e1421f0fae34ef47b83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.267567 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.270102 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.270137 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.270149 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.270166 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.270176 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:47Z","lastTransitionTime":"2025-10-03T18:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.373191 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.373437 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.373446 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.373459 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.373468 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:47Z","lastTransitionTime":"2025-10-03T18:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.475394 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.475433 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.475441 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.475460 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.475469 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:47Z","lastTransitionTime":"2025-10-03T18:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.577770 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.577816 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.577826 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.577842 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.577852 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:47Z","lastTransitionTime":"2025-10-03T18:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.680280 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.680551 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.680652 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.680739 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.680843 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:47Z","lastTransitionTime":"2025-10-03T18:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.783524 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.783572 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.783587 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.783612 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.783626 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:47Z","lastTransitionTime":"2025-10-03T18:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.876838 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.876838 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:47 crc kubenswrapper[4835]: E1003 18:14:47.877025 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:14:47 crc kubenswrapper[4835]: E1003 18:14:47.876959 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.885913 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.885950 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.885958 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.885973 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.885982 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:47Z","lastTransitionTime":"2025-10-03T18:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.988388 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.988427 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.988435 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.988450 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:47 crc kubenswrapper[4835]: I1003 18:14:47.988459 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:47Z","lastTransitionTime":"2025-10-03T18:14:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.062181 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.091517 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.091548 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.091557 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.091570 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.091578 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:48Z","lastTransitionTime":"2025-10-03T18:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.193688 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.193727 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.193735 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.193749 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.193758 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:48Z","lastTransitionTime":"2025-10-03T18:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.296506 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.296545 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.296553 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.296569 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.296578 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:48Z","lastTransitionTime":"2025-10-03T18:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.399057 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.399134 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.399146 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.399164 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.399174 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:48Z","lastTransitionTime":"2025-10-03T18:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.501575 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.501634 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.501646 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.501664 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.501678 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:48Z","lastTransitionTime":"2025-10-03T18:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.603878 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.603914 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.603921 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.603935 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.603945 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:48Z","lastTransitionTime":"2025-10-03T18:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.706288 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.706321 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.706339 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.706356 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.706367 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:48Z","lastTransitionTime":"2025-10-03T18:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.808610 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.808648 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.808657 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.808672 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.808681 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:48Z","lastTransitionTime":"2025-10-03T18:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.876286 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:48 crc kubenswrapper[4835]: E1003 18:14:48.876402 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.893726 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2edcb0f841037e045e7e84fea3c91bbae7503d0e9f076e1421f0fae34ef47b83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"}
,{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.905117 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.910709 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.910769 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.910779 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.910792 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.910802 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:48Z","lastTransitionTime":"2025-10-03T18:14:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.925551 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.935089 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.945615 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.957524 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.973941 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.987814 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:48 crc kubenswrapper[4835]: I1003 18:14:48.998810 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.013843 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.013884 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.013894 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.013910 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.013921 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:49Z","lastTransitionTime":"2025-10-03T18:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.016039 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.031622 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.043746 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.062323 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb
03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.065895 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovnkube-controller/0.log" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.068313 4835 generic.go:334] "Generic (PLEG): container finished" podID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerID="2edcb0f841037e045e7e84fea3c91bbae7503d0e9f076e1421f0fae34ef47b83" exitCode=1 Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.068349 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerDied","Data":"2edcb0f841037e045e7e84fea3c91bbae7503d0e9f076e1421f0fae34ef47b83"} Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.068956 4835 scope.go:117] "RemoveContainer" containerID="2edcb0f841037e045e7e84fea3c91bbae7503d0e9f076e1421f0fae34ef47b83" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.077967 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.090717 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.103707 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.115046 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.116742 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.116847 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.116924 4835 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.117002 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.117104 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:49Z","lastTransitionTime":"2025-10-03T18:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.126945 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.139890 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.154415 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.167714 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.179116 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.189709 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.205484 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.219039 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.219095 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 
18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.219107 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.219122 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.219133 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:49Z","lastTransitionTime":"2025-10-03T18:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.219257 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
3T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.238266 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\
\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.252111 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.264798 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.284654 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71
f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2edcb0f841037e045e7e84fea3c91bbae7503d0e9f076e1421f0fae34ef47b83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2edcb0f841037e045e7e84fea3c91bbae7503d0e9f076e1421f0fae34ef47b83\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:14:48Z\\\",\\\"message\\\":\\\"r removal\\\\nI1003 18:14:48.223462 6132 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 18:14:48.223467 6132 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 18:14:48.223464 6132 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 18:14:48.223484 6132 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 18:14:48.223479 6132 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 18:14:48.223505 6132 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 18:14:48.223512 6132 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 18:14:48.223513 6132 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 18:14:48.223492 6132 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 18:14:48.223552 6132 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 18:14:48.223558 6132 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 18:14:48.223570 6132 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 18:14:48.223573 6132 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 18:14:48.223592 6132 factory.go:656] Stopping watch factory\\\\nI1003 18:14:48.223594 6132 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 18:14:48.223603 6132 ovnkube.go:599] Stopped ovnkube\\\\nI1003 
18:14:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.294886 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.322150 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.322203 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.322213 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.322229 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.322255 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:49Z","lastTransitionTime":"2025-10-03T18:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.424442 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.424501 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.424512 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.424550 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.424563 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:49Z","lastTransitionTime":"2025-10-03T18:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.526418 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.526473 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.526483 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.526495 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.526507 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:49Z","lastTransitionTime":"2025-10-03T18:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.628276 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.628317 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.628328 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.628346 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.628356 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:49Z","lastTransitionTime":"2025-10-03T18:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.730361 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.730403 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.730411 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.730427 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.730438 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:49Z","lastTransitionTime":"2025-10-03T18:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.832308 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.832607 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.832616 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.832630 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.832642 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:49Z","lastTransitionTime":"2025-10-03T18:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.876106 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.876145 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:49 crc kubenswrapper[4835]: E1003 18:14:49.876245 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:14:49 crc kubenswrapper[4835]: E1003 18:14:49.876353 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.934694 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.934732 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.934742 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.934755 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:49 crc kubenswrapper[4835]: I1003 18:14:49.934765 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:49Z","lastTransitionTime":"2025-10-03T18:14:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.036751 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.036783 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.036807 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.036822 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.036831 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:50Z","lastTransitionTime":"2025-10-03T18:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.072909 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovnkube-controller/1.log" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.073569 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovnkube-controller/0.log" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.075913 4835 generic.go:334] "Generic (PLEG): container finished" podID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerID="3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f" exitCode=1 Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.075967 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerDied","Data":"3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f"} Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.076025 4835 scope.go:117] "RemoveContainer" containerID="2edcb0f841037e045e7e84fea3c91bbae7503d0e9f076e1421f0fae34ef47b83" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.076448 4835 scope.go:117] "RemoveContainer" containerID="3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f" Oct 03 18:14:50 crc kubenswrapper[4835]: E1003 18:14:50.076577 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.093584 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.108808 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.125687 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.138186 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.138998 4835 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.139029 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.139039 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.139053 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.139064 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:50Z","lastTransitionTime":"2025-10-03T18:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.151464 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.164201 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.177175 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.192056 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.203387 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.217814 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.237891 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a
518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.241218 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.241253 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.241263 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.241276 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.241286 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:50Z","lastTransitionTime":"2025-10-03T18:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.250241 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.266231 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.293731 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71
f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2edcb0f841037e045e7e84fea3c91bbae7503d0e9f076e1421f0fae34ef47b83\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:14:48Z\\\",\\\"message\\\":\\\"r removal\\\\nI1003 18:14:48.223462 6132 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 18:14:48.223467 6132 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 18:14:48.223464 6132 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 18:14:48.223484 6132 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1003 18:14:48.223479 6132 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 18:14:48.223505 6132 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 18:14:48.223512 6132 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 18:14:48.223513 6132 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 18:14:48.223492 6132 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1003 18:14:48.223552 6132 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 18:14:48.223558 6132 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 18:14:48.223570 6132 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 18:14:48.223573 6132 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 18:14:48.223592 6132 factory.go:656] Stopping watch factory\\\\nI1003 18:14:48.223594 6132 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 18:14:48.223603 6132 ovnkube.go:599] Stopped ovnkube\\\\nI1003 
18:14:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:14:49Z\\\",\\\"message\\\":\\\" {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 18:14:49.779440 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 18:14:49.779450 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 18:14:49.779439 6261 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1003 18:14:49.779503 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, 
fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.307186 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.343920 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.343962 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.343970 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.343984 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.343995 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:50Z","lastTransitionTime":"2025-10-03T18:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.446539 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.446582 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.446591 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.446606 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.446617 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:50Z","lastTransitionTime":"2025-10-03T18:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.549285 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.549333 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.549342 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.549357 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.549368 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:50Z","lastTransitionTime":"2025-10-03T18:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.652102 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.652136 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.652145 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.652158 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.652167 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:50Z","lastTransitionTime":"2025-10-03T18:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.753788 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.753828 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.753839 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.753854 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.753866 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:50Z","lastTransitionTime":"2025-10-03T18:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.856568 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.856621 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.856637 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.856658 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.856669 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:50Z","lastTransitionTime":"2025-10-03T18:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.876923 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:50 crc kubenswrapper[4835]: E1003 18:14:50.877059 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.959270 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.959304 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.959313 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.959325 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:50 crc kubenswrapper[4835]: I1003 18:14:50.959335 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:50Z","lastTransitionTime":"2025-10-03T18:14:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.061859 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.061894 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.061906 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.061922 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.061933 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:51Z","lastTransitionTime":"2025-10-03T18:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.079834 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovnkube-controller/1.log" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.082605 4835 scope.go:117] "RemoveContainer" containerID="3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f" Oct 03 18:14:51 crc kubenswrapper[4835]: E1003 18:14:51.082732 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.095428 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.109216 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.119882 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.132895 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb
03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.144133 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.154118 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.164534 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.164585 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.164599 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.164618 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.164631 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:51Z","lastTransitionTime":"2025-10-03T18:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.165435 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.174002 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.193336 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.202669 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.214051 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.248402 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71
f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:14:49Z\\\",\\\"message\\\":\\\" {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 18:14:49.779440 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 18:14:49.779450 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 18:14:49.779439 6261 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1003 18:14:49.779503 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, 
fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.267453 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.267490 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.267500 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.267515 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.267524 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:51Z","lastTransitionTime":"2025-10-03T18:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.276277 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.288086 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.300661 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.369699 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.369729 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.369741 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.369753 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.369762 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:51Z","lastTransitionTime":"2025-10-03T18:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.471406 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.471440 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.471450 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.471464 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.471473 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:51Z","lastTransitionTime":"2025-10-03T18:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.573855 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.573903 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.573915 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.573934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.573948 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:51Z","lastTransitionTime":"2025-10-03T18:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.676755 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.676792 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.676800 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.676813 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.676821 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:51Z","lastTransitionTime":"2025-10-03T18:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.778752 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.778786 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.778795 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.778807 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.778817 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:51Z","lastTransitionTime":"2025-10-03T18:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.876178 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.876254 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:51 crc kubenswrapper[4835]: E1003 18:14:51.876407 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:14:51 crc kubenswrapper[4835]: E1003 18:14:51.876331 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.877653 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j"] Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.878113 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.880254 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.880938 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.880965 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.880994 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.881008 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.881019 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:51Z","lastTransitionTime":"2025-10-03T18:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.881567 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.893614 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.906875 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.919431 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.930834 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.941454 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.951346 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.963507 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.977141 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a
4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.983572 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.983613 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.983626 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.983643 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.983655 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:51Z","lastTransitionTime":"2025-10-03T18:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:51 crc kubenswrapper[4835]: I1003 18:14:51.989030 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a5
66701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.000364 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.001791 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnmm8\" (UniqueName: \"kubernetes.io/projected/1782f6fb-6c25-419c-914a-9f88c72af1bc-kube-api-access-nnmm8\") pod \"ovnkube-control-plane-749d76644c-n7z6j\" (UID: \"1782f6fb-6c25-419c-914a-9f88c72af1bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.001845 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1782f6fb-6c25-419c-914a-9f88c72af1bc-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-n7z6j\" (UID: \"1782f6fb-6c25-419c-914a-9f88c72af1bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.001916 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1782f6fb-6c25-419c-914a-9f88c72af1bc-env-overrides\") pod \"ovnkube-control-plane-749d76644c-n7z6j\" (UID: \"1782f6fb-6c25-419c-914a-9f88c72af1bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.001968 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1782f6fb-6c25-419c-914a-9f88c72af1bc-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-n7z6j\" (UID: \"1782f6fb-6c25-419c-914a-9f88c72af1bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.013120 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-03T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.026719 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\
\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.045931 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"
run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:14:49Z\\\",\\\"message\\\":\\\" {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 18:14:49.779440 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 18:14:49.779450 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 18:14:49.779439 6261 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1003 18:14:49.779503 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network 
policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-l
og\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.060822 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.085248 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.085317 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.085332 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.085349 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.085361 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:52Z","lastTransitionTime":"2025-10-03T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.096626 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.102979 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1782f6fb-6c25-419c-914a-9f88c72af1bc-env-overrides\") pod \"ovnkube-control-plane-749d76644c-n7z6j\" (UID: \"1782f6fb-6c25-419c-914a-9f88c72af1bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.103026 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1782f6fb-6c25-419c-914a-9f88c72af1bc-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-n7z6j\" (UID: \"1782f6fb-6c25-419c-914a-9f88c72af1bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.103174 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnmm8\" (UniqueName: \"kubernetes.io/projected/1782f6fb-6c25-419c-914a-9f88c72af1bc-kube-api-access-nnmm8\") pod \"ovnkube-control-plane-749d76644c-n7z6j\" (UID: \"1782f6fb-6c25-419c-914a-9f88c72af1bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.103214 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1782f6fb-6c25-419c-914a-9f88c72af1bc-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-n7z6j\" (UID: \"1782f6fb-6c25-419c-914a-9f88c72af1bc\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.103560 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1782f6fb-6c25-419c-914a-9f88c72af1bc-env-overrides\") pod \"ovnkube-control-plane-749d76644c-n7z6j\" (UID: \"1782f6fb-6c25-419c-914a-9f88c72af1bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.104126 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1782f6fb-6c25-419c-914a-9f88c72af1bc-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-n7z6j\" (UID: \"1782f6fb-6c25-419c-914a-9f88c72af1bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.109664 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1782f6fb-6c25-419c-914a-9f88c72af1bc-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-n7z6j\" (UID: \"1782f6fb-6c25-419c-914a-9f88c72af1bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.111107 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\
\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.117923 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnmm8\" (UniqueName: \"kubernetes.io/projected/1782f6fb-6c25-419c-914a-9f88c72af1bc-kube-api-access-nnmm8\") pod \"ovnkube-control-plane-749d76644c-n7z6j\" (UID: \"1782f6fb-6c25-419c-914a-9f88c72af1bc\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.188239 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.188281 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.188292 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.188327 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.188339 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:52Z","lastTransitionTime":"2025-10-03T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.190540 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" Oct 03 18:14:52 crc kubenswrapper[4835]: W1003 18:14:52.212799 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1782f6fb_6c25_419c_914a_9f88c72af1bc.slice/crio-f14a13d13f189fb5149da208d2c24dac4037bccd901db51888c49b5256525db3 WatchSource:0}: Error finding container f14a13d13f189fb5149da208d2c24dac4037bccd901db51888c49b5256525db3: Status 404 returned error can't find the container with id f14a13d13f189fb5149da208d2c24dac4037bccd901db51888c49b5256525db3 Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.290328 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.290368 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.290381 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.290397 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.290412 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:52Z","lastTransitionTime":"2025-10-03T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.392697 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.392744 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.392755 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.392774 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.392786 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:52Z","lastTransitionTime":"2025-10-03T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.495849 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.495890 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.495900 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.495916 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.495927 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:52Z","lastTransitionTime":"2025-10-03T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.598550 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.598587 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.598595 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.598610 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.598621 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:52Z","lastTransitionTime":"2025-10-03T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.700615 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.700649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.700657 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.700670 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.700682 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:52Z","lastTransitionTime":"2025-10-03T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.803202 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.803426 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.803525 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.803604 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.803687 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:52Z","lastTransitionTime":"2025-10-03T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.876814 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:52 crc kubenswrapper[4835]: E1003 18:14:52.877153 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.906154 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.906193 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.906201 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.906213 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.906223 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:52Z","lastTransitionTime":"2025-10-03T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.955500 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-vlmkl"] Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.956058 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:14:52 crc kubenswrapper[4835]: E1003 18:14:52.956166 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.968870 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.979739 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:52 crc kubenswrapper[4835]: I1003 18:14:52.988850 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.000022 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.008578 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.008608 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.008618 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.008636 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.008647 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:53Z","lastTransitionTime":"2025-10-03T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.011088 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.020421 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.031765 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.042652 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.054796 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.063413 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.073464 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb
03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.082135 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.089141 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" event={"ID":"1782f6fb-6c25-419c-914a-9f88c72af1bc","Type":"ContainerStarted","Data":"43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b"} Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.089174 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" event={"ID":"1782f6fb-6c25-419c-914a-9f88c72af1bc","Type":"ContainerStarted","Data":"c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8"} Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.089189 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" event={"ID":"1782f6fb-6c25-419c-914a-9f88c72af1bc","Type":"ContainerStarted","Data":"f14a13d13f189fb5149da208d2c24dac4037bccd901db51888c49b5256525db3"} Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.094157 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.110488 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.110520 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.110531 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.110544 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.110553 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:53Z","lastTransitionTime":"2025-10-03T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.112833 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs\") pod \"network-metrics-daemon-vlmkl\" (UID: \"e2705556-f411-476d-9d8a-78543bae8dc7\") " pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.112870 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm88q\" (UniqueName: \"kubernetes.io/projected/e2705556-f411-476d-9d8a-78543bae8dc7-kube-api-access-xm88q\") pod \"network-metrics-daemon-vlmkl\" (UID: \"e2705556-f411-476d-9d8a-78543bae8dc7\") " pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.116282 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6
ec713f63aadf03afaa03459f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:14:49Z\\\",\\\"message\\\":\\\" {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 18:14:49.779440 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 18:14:49.779450 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 18:14:49.779439 6261 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1003 18:14:49.779503 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.125136 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.147443 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda
72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.158278 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.168198 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.179890 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.198181 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71
f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:14:49Z\\\",\\\"message\\\":\\\" {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 18:14:49.779440 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 18:14:49.779450 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 18:14:49.779439 6261 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1003 18:14:49.779503 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, 
fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.208204 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.212908 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.212953 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.212966 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.212981 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.212990 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:53Z","lastTransitionTime":"2025-10-03T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.213502 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs\") pod \"network-metrics-daemon-vlmkl\" (UID: \"e2705556-f411-476d-9d8a-78543bae8dc7\") " pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.213537 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm88q\" (UniqueName: \"kubernetes.io/projected/e2705556-f411-476d-9d8a-78543bae8dc7-kube-api-access-xm88q\") pod \"network-metrics-daemon-vlmkl\" (UID: \"e2705556-f411-476d-9d8a-78543bae8dc7\") " pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.213919 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.213981 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs podName:e2705556-f411-476d-9d8a-78543bae8dc7 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:53.713967062 +0000 UTC m=+35.429907934 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs") pod "network-metrics-daemon-vlmkl" (UID: "e2705556-f411-476d-9d8a-78543bae8dc7") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.225903 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a
518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.229153 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm88q\" (UniqueName: \"kubernetes.io/projected/e2705556-f411-476d-9d8a-78543bae8dc7-kube-api-access-xm88q\") pod \"network-metrics-daemon-vlmkl\" (UID: \"e2705556-f411-476d-9d8a-78543bae8dc7\") " pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.236453 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.247840 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.258952 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.268774 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 
18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.281730 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.292448 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.303292 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.313453 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.314475 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.314504 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.314514 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.314529 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.314538 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:53Z","lastTransitionTime":"2025-10-03T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.327539 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.340420 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.350379 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.360248 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb
03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.415909 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.415945 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.415954 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.415968 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.415977 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:53Z","lastTransitionTime":"2025-10-03T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.518314 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.518352 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.518360 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.518371 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.518380 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:53Z","lastTransitionTime":"2025-10-03T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.615945 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.616057 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:15:09.616033495 +0000 UTC m=+51.331974367 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.620752 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.620786 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.620797 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.620813 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.620822 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:53Z","lastTransitionTime":"2025-10-03T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.717111 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs\") pod \"network-metrics-daemon-vlmkl\" (UID: \"e2705556-f411-476d-9d8a-78543bae8dc7\") " pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.717163 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.717188 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.717207 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.717227 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.717312 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.717334 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.717349 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.717365 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.717375 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.717398 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 18:15:09.717376827 +0000 UTC m=+51.433317699 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.717334 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.717409 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.717440 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.717450 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.717421 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs podName:e2705556-f411-476d-9d8a-78543bae8dc7 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:54.717412648 +0000 UTC m=+36.433353620 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs") pod "network-metrics-daemon-vlmkl" (UID: "e2705556-f411-476d-9d8a-78543bae8dc7") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.717492 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 18:15:09.71748279 +0000 UTC m=+51.433423662 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.717504 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 18:15:09.71749829 +0000 UTC m=+51.433439162 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.717514 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 18:15:09.7175095 +0000 UTC m=+51.433450372 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.723303 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.723338 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.723349 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.723365 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.723375 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:53Z","lastTransitionTime":"2025-10-03T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.825656 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.825690 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.825698 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.825710 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.825719 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:53Z","lastTransitionTime":"2025-10-03T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.853517 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.853552 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.853561 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.853573 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.853582 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:53Z","lastTransitionTime":"2025-10-03T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.871113 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.875568 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.875604 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.875614 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.875630 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.875641 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:53Z","lastTransitionTime":"2025-10-03T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.875878 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.875947 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.875995 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.876162 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.888139 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.891836 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.891917 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.891940 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.891963 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.891983 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:53Z","lastTransitionTime":"2025-10-03T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.908179 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.911694 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.911725 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.911733 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.911746 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.911757 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:53Z","lastTransitionTime":"2025-10-03T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.922295 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.925146 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.925186 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.925197 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.925210 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.925219 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:53Z","lastTransitionTime":"2025-10-03T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.936614 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:53 crc kubenswrapper[4835]: E1003 18:14:53.936734 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.938012 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.938041 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.938049 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.938062 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:53 crc kubenswrapper[4835]: I1003 18:14:53.938098 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:53Z","lastTransitionTime":"2025-10-03T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.040571 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.040614 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.040622 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.040639 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.040650 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:54Z","lastTransitionTime":"2025-10-03T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.143243 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.143279 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.143289 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.143306 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.143317 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:54Z","lastTransitionTime":"2025-10-03T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.246601 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.246908 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.246920 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.246935 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.246944 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:54Z","lastTransitionTime":"2025-10-03T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.349028 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.349101 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.349116 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.349137 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.349208 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:54Z","lastTransitionTime":"2025-10-03T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.451251 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.451289 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.451299 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.451314 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.451323 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:54Z","lastTransitionTime":"2025-10-03T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.553527 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.553559 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.553567 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.553602 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.553613 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:54Z","lastTransitionTime":"2025-10-03T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.655756 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.655798 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.655809 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.655835 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.655847 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:54Z","lastTransitionTime":"2025-10-03T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.728563 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs\") pod \"network-metrics-daemon-vlmkl\" (UID: \"e2705556-f411-476d-9d8a-78543bae8dc7\") " pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:14:54 crc kubenswrapper[4835]: E1003 18:14:54.728685 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 18:14:54 crc kubenswrapper[4835]: E1003 18:14:54.728738 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs podName:e2705556-f411-476d-9d8a-78543bae8dc7 nodeName:}" failed. No retries permitted until 2025-10-03 18:14:56.728724679 +0000 UTC m=+38.444665551 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs") pod "network-metrics-daemon-vlmkl" (UID: "e2705556-f411-476d-9d8a-78543bae8dc7") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.759038 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.759104 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.759164 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.759183 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.759193 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:54Z","lastTransitionTime":"2025-10-03T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.861404 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.861453 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.861465 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.861482 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.861493 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:54Z","lastTransitionTime":"2025-10-03T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.876715 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.876769 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:54 crc kubenswrapper[4835]: E1003 18:14:54.876836 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:14:54 crc kubenswrapper[4835]: E1003 18:14:54.876930 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.963434 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.963483 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.963491 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.963504 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:54 crc kubenswrapper[4835]: I1003 18:14:54.963519 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:54Z","lastTransitionTime":"2025-10-03T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.065586 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.065619 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.065632 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.065647 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.065658 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:55Z","lastTransitionTime":"2025-10-03T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.167448 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.167485 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.167494 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.167507 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.167517 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:55Z","lastTransitionTime":"2025-10-03T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.269951 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.269992 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.270004 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.270019 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.270029 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:55Z","lastTransitionTime":"2025-10-03T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.372274 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.372309 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.372317 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.372330 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.372341 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:55Z","lastTransitionTime":"2025-10-03T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.474823 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.474858 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.474866 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.474880 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.474888 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:55Z","lastTransitionTime":"2025-10-03T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.577224 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.577260 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.577270 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.577284 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.577294 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:55Z","lastTransitionTime":"2025-10-03T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.679244 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.679283 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.679295 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.679311 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.679322 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:55Z","lastTransitionTime":"2025-10-03T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.781831 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.781918 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.781926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.781943 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.781952 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:55Z","lastTransitionTime":"2025-10-03T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.876146 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.876181 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:55 crc kubenswrapper[4835]: E1003 18:14:55.876288 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:14:55 crc kubenswrapper[4835]: E1003 18:14:55.876369 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.884625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.884680 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.884692 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.884708 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.884719 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:55Z","lastTransitionTime":"2025-10-03T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.987299 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.987344 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.987352 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.987366 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:55 crc kubenswrapper[4835]: I1003 18:14:55.987374 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:55Z","lastTransitionTime":"2025-10-03T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.089428 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.089630 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.089718 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.089779 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.089840 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:56Z","lastTransitionTime":"2025-10-03T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.191881 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.192387 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.192452 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.192569 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.192667 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:56Z","lastTransitionTime":"2025-10-03T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.295535 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.295572 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.295583 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.295598 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.295608 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:56Z","lastTransitionTime":"2025-10-03T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.398232 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.398283 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.398294 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.398318 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.398330 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:56Z","lastTransitionTime":"2025-10-03T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.500642 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.500921 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.501052 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.501189 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.501276 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:56Z","lastTransitionTime":"2025-10-03T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.603231 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.603268 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.603277 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.603290 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.603299 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:56Z","lastTransitionTime":"2025-10-03T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.705707 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.705745 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.705754 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.705769 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.705778 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:56Z","lastTransitionTime":"2025-10-03T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.748348 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs\") pod \"network-metrics-daemon-vlmkl\" (UID: \"e2705556-f411-476d-9d8a-78543bae8dc7\") " pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:14:56 crc kubenswrapper[4835]: E1003 18:14:56.748503 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 18:14:56 crc kubenswrapper[4835]: E1003 18:14:56.748557 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs podName:e2705556-f411-476d-9d8a-78543bae8dc7 nodeName:}" failed. No retries permitted until 2025-10-03 18:15:00.748543665 +0000 UTC m=+42.464484537 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs") pod "network-metrics-daemon-vlmkl" (UID: "e2705556-f411-476d-9d8a-78543bae8dc7") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.807875 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.807911 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.807921 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.807937 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.807950 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:56Z","lastTransitionTime":"2025-10-03T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.876228 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.876324 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:14:56 crc kubenswrapper[4835]: E1003 18:14:56.876372 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:14:56 crc kubenswrapper[4835]: E1003 18:14:56.876464 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.909673 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.909926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.909962 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.909981 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:56 crc kubenswrapper[4835]: I1003 18:14:56.910010 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:56Z","lastTransitionTime":"2025-10-03T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.014565 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.014613 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.014624 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.014665 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.014699 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:57Z","lastTransitionTime":"2025-10-03T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.117746 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.117797 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.117815 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.117852 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.117870 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:57Z","lastTransitionTime":"2025-10-03T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.219963 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.220009 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.220018 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.220035 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.220045 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:57Z","lastTransitionTime":"2025-10-03T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.322040 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.322088 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.322096 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.322111 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.322123 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:57Z","lastTransitionTime":"2025-10-03T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.423969 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.424003 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.424011 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.424024 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.424032 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:57Z","lastTransitionTime":"2025-10-03T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.526250 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.526294 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.526305 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.526321 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.526332 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:57Z","lastTransitionTime":"2025-10-03T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.628861 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.628900 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.628909 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.628922 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.628933 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:57Z","lastTransitionTime":"2025-10-03T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.635878 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.656717 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.668316 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.682744 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.704790 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71
f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:14:49Z\\\",\\\"message\\\":\\\" {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 18:14:49.779440 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 18:14:49.779450 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 18:14:49.779439 6261 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1003 18:14:49.779503 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, 
fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.716045 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.726461 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.730868 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.730899 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.730907 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.730920 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.730930 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:57Z","lastTransitionTime":"2025-10-03T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.741692 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.754330 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.769116 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.779547 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 
18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.790952 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.803983 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.815500 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.827870 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.832682 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.832773 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.832786 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.832803 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.832814 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:57Z","lastTransitionTime":"2025-10-03T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.841029 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.857016 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.873686 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:57Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.876867 4835 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.876903 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:57 crc kubenswrapper[4835]: E1003 18:14:57.876991 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:14:57 crc kubenswrapper[4835]: E1003 18:14:57.877064 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.935199 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.935230 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.935238 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.935250 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:57 crc kubenswrapper[4835]: I1003 18:14:57.935259 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:57Z","lastTransitionTime":"2025-10-03T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.037550 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.037587 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.037595 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.037611 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.037629 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:58Z","lastTransitionTime":"2025-10-03T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.139775 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.139826 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.139835 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.139851 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.139865 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:58Z","lastTransitionTime":"2025-10-03T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.241497 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.241536 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.241545 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.241559 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.241569 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:58Z","lastTransitionTime":"2025-10-03T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.343683 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.343733 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.343744 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.343761 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.343772 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:58Z","lastTransitionTime":"2025-10-03T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.446392 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.446440 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.446450 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.446465 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.446473 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:58Z","lastTransitionTime":"2025-10-03T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.549300 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.549343 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.549352 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.549366 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.549375 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:58Z","lastTransitionTime":"2025-10-03T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.651534 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.651576 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.651587 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.651604 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.651615 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:58Z","lastTransitionTime":"2025-10-03T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.753359 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.753395 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.753403 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.753415 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.753425 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:58Z","lastTransitionTime":"2025-10-03T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.855777 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.855812 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.855821 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.855843 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.855855 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:58Z","lastTransitionTime":"2025-10-03T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.875807 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.875818 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:14:58 crc kubenswrapper[4835]: E1003 18:14:58.875996 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:14:58 crc kubenswrapper[4835]: E1003 18:14:58.876123 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.887046 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.899977 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.911874 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.923528 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.935545 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.947647 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.957965 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.958019 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.958034 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.958052 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.958087 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:58Z","lastTransitionTime":"2025-10-03T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.963393 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\
\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.976273 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a56670
1c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.987478 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d
34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:58 crc kubenswrapper[4835]: I1003 18:14:58.998156 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.009110 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.027318 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6
ec713f63aadf03afaa03459f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:14:49Z\\\",\\\"message\\\":\\\" {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 18:14:49.779440 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 18:14:49.779450 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 18:14:49.779439 6261 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1003 18:14:49.779503 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.036594 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.057735 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda
72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.060046 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.060094 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.060108 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.060124 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.060136 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:59Z","lastTransitionTime":"2025-10-03T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.068013 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.079942 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.089384 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 03 18:14:59 crc 
kubenswrapper[4835]: I1003 18:14:59.162064 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.162124 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.162134 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.162152 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.162162 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:59Z","lastTransitionTime":"2025-10-03T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.264345 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.264388 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.264397 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.264410 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.264419 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:59Z","lastTransitionTime":"2025-10-03T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.366373 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.366418 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.366430 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.366446 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.366457 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:59Z","lastTransitionTime":"2025-10-03T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.468621 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.468663 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.468681 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.468701 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.468709 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:59Z","lastTransitionTime":"2025-10-03T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.571034 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.571086 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.571099 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.571116 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.571127 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:59Z","lastTransitionTime":"2025-10-03T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.673370 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.673404 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.673414 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.673428 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.673439 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:59Z","lastTransitionTime":"2025-10-03T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.775672 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.775737 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.775749 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.775768 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.775780 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:59Z","lastTransitionTime":"2025-10-03T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.876318 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.876362 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:14:59 crc kubenswrapper[4835]: E1003 18:14:59.876475 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:14:59 crc kubenswrapper[4835]: E1003 18:14:59.876559 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.877668 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.877744 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.877759 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.877776 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.877788 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:59Z","lastTransitionTime":"2025-10-03T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.979539 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.979582 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.979591 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.979617 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:14:59 crc kubenswrapper[4835]: I1003 18:14:59.979627 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:14:59Z","lastTransitionTime":"2025-10-03T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.081776 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.081812 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.081824 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.081837 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.081848 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:00Z","lastTransitionTime":"2025-10-03T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.183799 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.183844 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.183856 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.183876 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.183887 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:00Z","lastTransitionTime":"2025-10-03T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.285915 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.285944 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.285952 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.285968 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.285977 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:00Z","lastTransitionTime":"2025-10-03T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.387986 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.388028 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.388039 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.388054 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.388081 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:00Z","lastTransitionTime":"2025-10-03T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.490235 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.490265 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.490274 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.490285 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.490294 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:00Z","lastTransitionTime":"2025-10-03T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.592326 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.592366 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.592377 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.592394 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.592406 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:00Z","lastTransitionTime":"2025-10-03T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.694732 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.694776 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.694788 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.694804 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.694814 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:00Z","lastTransitionTime":"2025-10-03T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.784779 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs\") pod \"network-metrics-daemon-vlmkl\" (UID: \"e2705556-f411-476d-9d8a-78543bae8dc7\") " pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:00 crc kubenswrapper[4835]: E1003 18:15:00.784926 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 18:15:00 crc kubenswrapper[4835]: E1003 18:15:00.784988 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs podName:e2705556-f411-476d-9d8a-78543bae8dc7 nodeName:}" failed. No retries permitted until 2025-10-03 18:15:08.784973684 +0000 UTC m=+50.500914556 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs") pod "network-metrics-daemon-vlmkl" (UID: "e2705556-f411-476d-9d8a-78543bae8dc7") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.797087 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.797132 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.797144 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.797157 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.797166 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:00Z","lastTransitionTime":"2025-10-03T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.876786 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.876858 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:00 crc kubenswrapper[4835]: E1003 18:15:00.876916 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:00 crc kubenswrapper[4835]: E1003 18:15:00.877025 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.899269 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.899302 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.899310 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.899324 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:00 crc kubenswrapper[4835]: I1003 18:15:00.899332 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:00Z","lastTransitionTime":"2025-10-03T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.010059 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.010139 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.010152 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.010171 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.010183 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:01Z","lastTransitionTime":"2025-10-03T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.111751 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.111798 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.111809 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.111825 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.111835 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:01Z","lastTransitionTime":"2025-10-03T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.214117 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.214153 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.214162 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.214175 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.214185 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:01Z","lastTransitionTime":"2025-10-03T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.316386 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.316446 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.316458 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.316471 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.316480 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:01Z","lastTransitionTime":"2025-10-03T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.418261 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.418302 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.418309 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.418323 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.418334 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:01Z","lastTransitionTime":"2025-10-03T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.520264 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.520306 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.520315 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.520333 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.520342 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:01Z","lastTransitionTime":"2025-10-03T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.622324 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.622368 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.622375 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.622391 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.622401 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:01Z","lastTransitionTime":"2025-10-03T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.724170 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.724209 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.724218 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.724231 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.724240 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:01Z","lastTransitionTime":"2025-10-03T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.826751 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.826795 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.826806 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.826826 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.826841 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:01Z","lastTransitionTime":"2025-10-03T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.876624 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.876682 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:01 crc kubenswrapper[4835]: E1003 18:15:01.876733 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:01 crc kubenswrapper[4835]: E1003 18:15:01.876855 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.928618 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.928657 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.928666 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.928684 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:01 crc kubenswrapper[4835]: I1003 18:15:01.928695 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:01Z","lastTransitionTime":"2025-10-03T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.031162 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.031192 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.031202 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.031216 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.031227 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:02Z","lastTransitionTime":"2025-10-03T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.133266 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.133301 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.133311 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.133325 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.133334 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:02Z","lastTransitionTime":"2025-10-03T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.235731 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.235764 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.235773 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.235787 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.235796 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:02Z","lastTransitionTime":"2025-10-03T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.337738 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.337766 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.337774 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.337788 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.337797 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:02Z","lastTransitionTime":"2025-10-03T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.440206 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.440267 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.440286 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.440307 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.440318 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:02Z","lastTransitionTime":"2025-10-03T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.543463 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.543503 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.543513 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.543531 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.543541 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:02Z","lastTransitionTime":"2025-10-03T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.646014 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.646238 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.646262 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.646286 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.646296 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:02Z","lastTransitionTime":"2025-10-03T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.748544 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.748579 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.748588 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.748602 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.748615 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:02Z","lastTransitionTime":"2025-10-03T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.850793 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.850848 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.850860 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.850879 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.850891 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:02Z","lastTransitionTime":"2025-10-03T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.876134 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.876181 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:02 crc kubenswrapper[4835]: E1003 18:15:02.876310 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:02 crc kubenswrapper[4835]: E1003 18:15:02.876439 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.952522 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.952550 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.952558 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.952573 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:02 crc kubenswrapper[4835]: I1003 18:15:02.952583 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:02Z","lastTransitionTime":"2025-10-03T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.054974 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.055249 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.055328 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.055396 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.055460 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:03Z","lastTransitionTime":"2025-10-03T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.158228 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.158269 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.158278 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.158294 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.158319 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:03Z","lastTransitionTime":"2025-10-03T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.260238 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.260283 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.260294 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.260308 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.260318 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:03Z","lastTransitionTime":"2025-10-03T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.362254 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.362302 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.362319 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.362338 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.362349 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:03Z","lastTransitionTime":"2025-10-03T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.465189 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.465252 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.465265 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.465283 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.465303 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:03Z","lastTransitionTime":"2025-10-03T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.567748 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.568024 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.568132 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.568206 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.568272 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:03Z","lastTransitionTime":"2025-10-03T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.670395 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.670705 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.670781 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.670849 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.670920 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:03Z","lastTransitionTime":"2025-10-03T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.773641 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.773689 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.773700 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.773717 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.773729 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:03Z","lastTransitionTime":"2025-10-03T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.875800 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.875892 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.876365 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.876407 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:03 crc kubenswrapper[4835]: E1003 18:15:03.876397 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.876419 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.876464 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.876477 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:03Z","lastTransitionTime":"2025-10-03T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:03 crc kubenswrapper[4835]: E1003 18:15:03.876378 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.978564 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.978602 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.978610 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.978625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:03 crc kubenswrapper[4835]: I1003 18:15:03.978637 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:03Z","lastTransitionTime":"2025-10-03T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.080678 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.080994 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.081092 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.081170 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.081229 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:04Z","lastTransitionTime":"2025-10-03T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.182725 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.182758 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.182769 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.182784 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.182797 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:04Z","lastTransitionTime":"2025-10-03T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.284853 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.285154 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.285220 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.285303 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.285363 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:04Z","lastTransitionTime":"2025-10-03T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.313247 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.313286 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.313298 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.313343 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.313356 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:04Z","lastTransitionTime":"2025-10-03T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:04 crc kubenswrapper[4835]: E1003 18:15:04.324875 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:04Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.327742 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.327886 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.327959 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.328030 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.328116 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:04Z","lastTransitionTime":"2025-10-03T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:04 crc kubenswrapper[4835]: E1003 18:15:04.338228 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:04Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.341729 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.341842 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.341911 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.341985 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.342053 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:04Z","lastTransitionTime":"2025-10-03T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:04 crc kubenswrapper[4835]: E1003 18:15:04.351796 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:04Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.355571 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.355608 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
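Each of the node-status patches above is rejected for the same reason: the kubelet cannot deliver the patch through the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 because that endpoint's serving certificate expired on 2025-08-24T17:21:41Z, weeks before the 2025-10-03 timestamps in this boot. A minimal probe run on the node can show what certificate the port is actually presenting; this is only a diagnostic sketch, it assumes the Python cryptography package is available, and it disables verification precisely so the expired certificate can still be read.

import socket, ssl
from cryptography import x509   # assumption: python3-cryptography is installed on the node

HOST, PORT = "127.0.0.1", 9743  # webhook endpoint taken from the kubelet errors above

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE           # skip verification so the expired cert can still be fetched
with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)   # raw DER, available even without verification
cert = x509.load_der_x509_certificate(der)
print("subject:  ", cert.subject.rfc4514_string())
print("notBefore:", cert.not_valid_before)
print("notAfter: ", cert.not_valid_after)

If the notAfter printed here matches 2025-08-24T17:21:41Z, the webhook is still serving the stale certificate and every retry below can be expected to fail the same way.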
event="NodeHasNoDiskPressure" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.355617 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.355632 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.355642 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:04Z","lastTransitionTime":"2025-10-03T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:04 crc kubenswrapper[4835]: E1003 18:15:04.368774 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:04Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.371832 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.371863 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.371877 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.371894 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.371904 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:04Z","lastTransitionTime":"2025-10-03T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:04 crc kubenswrapper[4835]: E1003 18:15:04.381552 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:04Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:04 crc kubenswrapper[4835]: E1003 18:15:04.381657 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.387665 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
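After the third identical rejection the kubelet gives up for this cycle ("update node status exceeds retry count"). The error text itself carries both clocks, so a few lines of parsing show how far past expiry the node's clock is; the message format below is copied from the entries above, and the parsing is only a sketch.

import re
from datetime import datetime

# One of the webhook errors above, trimmed to the part that matters (copied verbatim).
msg = ('tls: failed to verify certificate: x509: certificate has expired or is not yet valid: '
       'current time 2025-10-03T18:15:04Z is after 2025-08-24T17:21:41Z')

m = re.search(r'current time (\S+) is after (\S+)', msg)
now, not_after = (datetime.fromisoformat(t.replace('Z', '+00:00')) for t in m.groups())
print(f"certificate expired {now - not_after} before this log entry was written")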
event="NodeHasSufficientMemory" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.387691 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.387700 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.387711 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.387721 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:04Z","lastTransitionTime":"2025-10-03T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.489943 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.489993 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.490009 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.490028 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.490039 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:04Z","lastTransitionTime":"2025-10-03T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.592063 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.592124 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.592140 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.592159 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.592170 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:04Z","lastTransitionTime":"2025-10-03T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.694469 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.694520 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.694538 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.694560 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.694576 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:04Z","lastTransitionTime":"2025-10-03T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.796559 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.796610 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.796623 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.796641 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.796652 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:04Z","lastTransitionTime":"2025-10-03T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.875857 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.875857 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:04 crc kubenswrapper[4835]: E1003 18:15:04.875965 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
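The Ready=False condition and the two pod sync failures above all point at the same gap: there is no CNI configuration file in /etc/kubernetes/cni/net.d/, because the network plugin (ovn-kubernetes on this node) has not written its config yet. A quick check of that directory, sketched below, tells you whether the condition is still accurate; the file extensions searched for are an assumption about what the runtime accepts, the path itself is taken from the messages above.

from pathlib import Path

# Directory named in the NetworkPluginNotReady messages above.
cni_dir = Path("/etc/kubernetes/cni/net.d")

# Assumption: the runtime looks for *.conf, *.conflist or *.json files here.
configs = sorted(p for p in cni_dir.glob("*") if p.suffix in {".conf", ".conflist", ".json"})
if configs:
    for p in configs:
        print("found CNI config:", p)
else:
    print(f"no CNI configuration files in {cni_dir} - matches the kubelet message above")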
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:04 crc kubenswrapper[4835]: E1003 18:15:04.876127 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.877221 4835 scope.go:117] "RemoveContainer" containerID="3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.899324 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.899363 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.899372 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.899388 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:04 crc kubenswrapper[4835]: I1003 18:15:04.899398 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:04Z","lastTransitionTime":"2025-10-03T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.001038 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.001335 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.001348 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.001363 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.001374 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:05Z","lastTransitionTime":"2025-10-03T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.027447 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.103262 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.103297 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.103307 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.103322 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.103334 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:05Z","lastTransitionTime":"2025-10-03T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.122181 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovnkube-controller/1.log" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.124767 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerStarted","Data":"18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4"} Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.125101 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.143588 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a
518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.153380 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.166354 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-confi
g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.182945 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4a
d3d403691beb2bb5e51a3de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:14:49Z\\\",\\\"message\\\":\\\" {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 18:14:49.779440 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 18:14:49.779450 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 18:14:49.779439 6261 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1003 18:14:49.779503 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, 
fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.193192 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.203761 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.205135 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.205159 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.205168 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.205181 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.205189 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:05Z","lastTransitionTime":"2025-10-03T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.220930 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.237282 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.248346 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.263005 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 
18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.283239 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.299512 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.306811 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.306862 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.306875 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.306894 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.306905 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:05Z","lastTransitionTime":"2025-10-03T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.310526 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.321091 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.331615 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.343875 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.352660 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.409502 4835 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.409541 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.409552 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.409571 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.409582 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:05Z","lastTransitionTime":"2025-10-03T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.512563 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.512600 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.512610 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.512624 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.512633 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:05Z","lastTransitionTime":"2025-10-03T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.615263 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.615304 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.615315 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.615337 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.615347 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:05Z","lastTransitionTime":"2025-10-03T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.717931 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.717972 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.717981 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.717996 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.718014 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:05Z","lastTransitionTime":"2025-10-03T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.820624 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.820671 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.820682 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.820699 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.820711 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:05Z","lastTransitionTime":"2025-10-03T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.876028 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.876028 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:05 crc kubenswrapper[4835]: E1003 18:15:05.876164 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:05 crc kubenswrapper[4835]: E1003 18:15:05.876215 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.923062 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.923126 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.923145 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.923164 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:05 crc kubenswrapper[4835]: I1003 18:15:05.923223 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:05Z","lastTransitionTime":"2025-10-03T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.025934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.025981 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.025998 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.026019 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.026032 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:06Z","lastTransitionTime":"2025-10-03T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.127748 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.127801 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.127818 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.127841 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.127854 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:06Z","lastTransitionTime":"2025-10-03T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.129890 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovnkube-controller/2.log" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.130468 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovnkube-controller/1.log" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.132365 4835 generic.go:334] "Generic (PLEG): container finished" podID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerID="18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4" exitCode=1 Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.132400 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerDied","Data":"18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4"} Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.132434 4835 scope.go:117] "RemoveContainer" containerID="3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.133257 4835 scope.go:117] "RemoveContainer" containerID="18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4" Oct 03 18:15:06 crc kubenswrapper[4835]: E1003 18:15:06.133461 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.142499 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.155390 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.165145 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 
18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.176600 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.187400 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.197713 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.207423 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.217685 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.230089 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a
4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.230358 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.230509 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.230522 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.230534 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.230543 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:06Z","lastTransitionTime":"2025-10-03T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.240377 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a5
66701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.250965 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.265010 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.275610 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.291306 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71
f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f65f9b42dbcfdd2ed45f85f8fbf22b31ea6a6a6ec713f63aadf03afaa03459f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:14:49Z\\\",\\\"message\\\":\\\" {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 18:14:49.779440 6261 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 18:14:49.779450 6261 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 18:14:49.779439 6261 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1003 18:14:49.779503 6261 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, 
fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:05Z\\\",\\\"message\\\":\\\"g zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 18:15:05.650934 6487 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 807.71µs\\\\nI1003 18:15:05.650937 6487 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 785.9µs\\\\nI1003 18:15:05.650872 6487 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 18:15:05.650944 6487 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.300425 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.317141 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c
319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.327449 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.333193 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.333232 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.333243 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.333262 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.333274 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:06Z","lastTransitionTime":"2025-10-03T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.435442 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.435492 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.435504 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.435517 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.435527 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:06Z","lastTransitionTime":"2025-10-03T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.537598 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.537635 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.537647 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.537664 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.537674 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:06Z","lastTransitionTime":"2025-10-03T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.639726 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.639762 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.639770 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.639783 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.639791 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:06Z","lastTransitionTime":"2025-10-03T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.742016 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.742049 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.742056 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.742081 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.742090 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:06Z","lastTransitionTime":"2025-10-03T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.845089 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.845141 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.845154 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.845169 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.845179 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:06Z","lastTransitionTime":"2025-10-03T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.876866 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.876942 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:06 crc kubenswrapper[4835]: E1003 18:15:06.876994 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:06 crc kubenswrapper[4835]: E1003 18:15:06.877093 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.948102 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.948144 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.948155 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.948172 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:06 crc kubenswrapper[4835]: I1003 18:15:06.948183 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:06Z","lastTransitionTime":"2025-10-03T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.050246 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.050280 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.050288 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.050301 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.050311 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:07Z","lastTransitionTime":"2025-10-03T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.140441 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovnkube-controller/2.log" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.144099 4835 scope.go:117] "RemoveContainer" containerID="18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4" Oct 03 18:15:07 crc kubenswrapper[4835]: E1003 18:15:07.144272 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.151953 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.151988 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.151999 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.152016 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.152029 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:07Z","lastTransitionTime":"2025-10-03T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.154867 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.167536 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.183898 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71
f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:05Z\\\",\\\"message\\\":\\\"g zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 18:15:05.650934 6487 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 807.71µs\\\\nI1003 18:15:05.650937 6487 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 785.9µs\\\\nI1003 18:15:05.650872 6487 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 18:15:05.650944 6487 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:15:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.195783 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.213708 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a
518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.223895 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.234928 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.246099 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.253931 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.253979 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.253991 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.254009 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.254027 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:07Z","lastTransitionTime":"2025-10-03T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.256223 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.268468 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.279764 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.290087 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.304407 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.315562 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.328158 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.338105 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.348973 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb
03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.356376 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.356402 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.356413 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.356429 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.356441 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:07Z","lastTransitionTime":"2025-10-03T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.458246 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.458296 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.458305 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.458322 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.458330 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:07Z","lastTransitionTime":"2025-10-03T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.560330 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.560359 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.560368 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.560382 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.560391 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:07Z","lastTransitionTime":"2025-10-03T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.662849 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.662893 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.662901 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.662916 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.662925 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:07Z","lastTransitionTime":"2025-10-03T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.765504 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.765549 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.765560 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.765578 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.765592 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:07Z","lastTransitionTime":"2025-10-03T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.867254 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.868406 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.868435 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.868443 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.868458 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.868468 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:07Z","lastTransitionTime":"2025-10-03T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.876508 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:07 crc kubenswrapper[4835]: E1003 18:15:07.876854 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.876889 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:07 crc kubenswrapper[4835]: E1003 18:15:07.877130 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.876920 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.878572 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.889511 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 
18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.902185 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.913493 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.924237 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.936191 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.946654 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.960463 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a
4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.970621 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.971748 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.971778 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.971787 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.971803 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.971813 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:07Z","lastTransitionTime":"2025-10-03T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.982778 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358257
71aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:07 crc kubenswrapper[4835]: I1003 18:15:07.993685 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:07Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.003126 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:08Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.018112 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4a
d3d403691beb2bb5e51a3de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:05Z\\\",\\\"message\\\":\\\"g zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 18:15:05.650934 6487 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 807.71µs\\\\nI1003 18:15:05.650937 6487 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 785.9µs\\\\nI1003 18:15:05.650872 6487 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 18:15:05.650944 6487 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:15:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:08Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.026530 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:08Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.041602 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda
72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:08Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.052280 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:08Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.064322 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:08Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.073811 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.073941 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.074022 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.074149 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.074247 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:08Z","lastTransitionTime":"2025-10-03T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.176968 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.177004 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.177012 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.177040 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.177050 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:08Z","lastTransitionTime":"2025-10-03T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.279530 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.279858 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.279980 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.280140 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.280286 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:08Z","lastTransitionTime":"2025-10-03T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.382312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.382353 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.382363 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.382376 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.382386 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:08Z","lastTransitionTime":"2025-10-03T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.485007 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.485049 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.485060 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.485094 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.485107 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:08Z","lastTransitionTime":"2025-10-03T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.587156 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.587198 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.587208 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.587226 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.587235 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:08Z","lastTransitionTime":"2025-10-03T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.690110 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.690176 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.690199 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.690224 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.690241 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:08Z","lastTransitionTime":"2025-10-03T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.792594 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.792651 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.792669 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.792691 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.792708 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:08Z","lastTransitionTime":"2025-10-03T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.860234 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs\") pod \"network-metrics-daemon-vlmkl\" (UID: \"e2705556-f411-476d-9d8a-78543bae8dc7\") " pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:08 crc kubenswrapper[4835]: E1003 18:15:08.860494 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 18:15:08 crc kubenswrapper[4835]: E1003 18:15:08.860596 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs podName:e2705556-f411-476d-9d8a-78543bae8dc7 nodeName:}" failed. No retries permitted until 2025-10-03 18:15:24.86057154 +0000 UTC m=+66.576512482 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs") pod "network-metrics-daemon-vlmkl" (UID: "e2705556-f411-476d-9d8a-78543bae8dc7") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.876300 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:08 crc kubenswrapper[4835]: E1003 18:15:08.876439 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.876515 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:08 crc kubenswrapper[4835]: E1003 18:15:08.876711 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.893819 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:08Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.894533 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.894599 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.894616 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.894640 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.894658 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:08Z","lastTransitionTime":"2025-10-03T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.907607 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:08Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.922982 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:08Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.935833 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:08Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.947822 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:08Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.959828 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:08Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.983450 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:08Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.997205 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.997251 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 
18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.997260 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.997274 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:08 crc kubenswrapper[4835]: I1003 18:15:08.997284 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:08Z","lastTransitionTime":"2025-10-03T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.003050 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\
\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-
03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:09Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.013175 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:09Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.025740 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:09Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.045954 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71
f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:05Z\\\",\\\"message\\\":\\\"g zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 18:15:05.650934 6487 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 807.71µs\\\\nI1003 18:15:05.650937 6487 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 785.9µs\\\\nI1003 18:15:05.650872 6487 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 18:15:05.650944 6487 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:15:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:09Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.056938 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:09Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.067557 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5385e85b-313e-4b33-bf09-0b8f5e0a994a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e35992e79cb51b47dd78356feedcca634b95f0fbd0aac49017b88d555dab225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b9902dcf4b7c66e119168c3b3eb90f437ca1e723186a43c648d19f4101b851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aa63760f1ef0079ef65f47e141a5938e22007e5f3bf41a1e61491359b1eb0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:09Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.077452 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:09Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.091373 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:09Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.099042 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.099090 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.099101 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.099117 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.099129 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:09Z","lastTransitionTime":"2025-10-03T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.105194 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:09Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.118262 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:09Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.128526 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:09Z is after 2025-08-24T17:21:41Z" Oct 03 
18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.201098 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.201366 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.201467 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.201579 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.201676 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:09Z","lastTransitionTime":"2025-10-03T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.304732 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.304768 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.304778 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.304814 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.304824 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:09Z","lastTransitionTime":"2025-10-03T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.407790 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.407828 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.407839 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.407854 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.407864 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:09Z","lastTransitionTime":"2025-10-03T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.510693 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.510730 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.510739 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.510756 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.510765 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:09Z","lastTransitionTime":"2025-10-03T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.612607 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.612666 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.612679 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.612695 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.612704 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:09Z","lastTransitionTime":"2025-10-03T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.668356 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:15:09 crc kubenswrapper[4835]: E1003 18:15:09.668542 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:15:41.668506166 +0000 UTC m=+83.384447038 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.714917 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.714959 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.714997 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.715016 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.715026 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:09Z","lastTransitionTime":"2025-10-03T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.770640 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.770712 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.770743 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.770767 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:09 crc kubenswrapper[4835]: E1003 18:15:09.771010 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 18:15:09 crc kubenswrapper[4835]: E1003 18:15:09.771094 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 18:15:09 crc kubenswrapper[4835]: E1003 18:15:09.771025 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 18:15:09 crc kubenswrapper[4835]: E1003 18:15:09.771190 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 18:15:09 crc kubenswrapper[4835]: E1003 18:15:09.771239 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 18:15:09 crc kubenswrapper[4835]: E1003 18:15:09.771610 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:15:09 crc kubenswrapper[4835]: E1003 18:15:09.771156 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 18:15:41.771127472 +0000 UTC m=+83.487068354 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 18:15:09 crc kubenswrapper[4835]: E1003 18:15:09.771676 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 18:15:09 crc kubenswrapper[4835]: E1003 18:15:09.771703 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:15:09 crc kubenswrapper[4835]: E1003 18:15:09.771723 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 18:15:41.771679437 +0000 UTC m=+83.487620309 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 18:15:09 crc kubenswrapper[4835]: E1003 18:15:09.771747 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 18:15:41.771735148 +0000 UTC m=+83.487676020 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:15:09 crc kubenswrapper[4835]: E1003 18:15:09.771787 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 18:15:41.771767129 +0000 UTC m=+83.487708001 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.817568 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.817598 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.817606 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.817618 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.817626 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:09Z","lastTransitionTime":"2025-10-03T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.876664 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.876664 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:09 crc kubenswrapper[4835]: E1003 18:15:09.876773 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:09 crc kubenswrapper[4835]: E1003 18:15:09.876833 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.919923 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.919966 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.919976 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.919991 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:09 crc kubenswrapper[4835]: I1003 18:15:09.920002 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:09Z","lastTransitionTime":"2025-10-03T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.021966 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.022620 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.022644 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.022661 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.022671 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:10Z","lastTransitionTime":"2025-10-03T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.125122 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.125158 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.125168 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.125182 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.125191 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:10Z","lastTransitionTime":"2025-10-03T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.227202 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.227240 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.227249 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.227266 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.227280 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:10Z","lastTransitionTime":"2025-10-03T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.329937 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.329984 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.329997 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.330013 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.330025 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:10Z","lastTransitionTime":"2025-10-03T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.431933 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.431968 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.431978 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.431994 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.432005 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:10Z","lastTransitionTime":"2025-10-03T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.534414 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.534454 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.534466 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.534482 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.534494 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:10Z","lastTransitionTime":"2025-10-03T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.636471 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.636509 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.636518 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.636530 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.636539 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:10Z","lastTransitionTime":"2025-10-03T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.738537 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.738578 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.738586 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.738599 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.738608 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:10Z","lastTransitionTime":"2025-10-03T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.841185 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.841223 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.841233 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.841249 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.841257 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:10Z","lastTransitionTime":"2025-10-03T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.875860 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.875922 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:10 crc kubenswrapper[4835]: E1003 18:15:10.875991 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
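
Every "Node became not ready" condition in this window carries the same reason: NetworkReady=false because there is no CNI configuration file in /etc/kubernetes/cni/net.d/, i.e. the network plugin has not written its configuration yet. A small sketch follows, assuming Python 3 on the node, that checks that directory (path taken from the kubelet message) for CNI configuration files; the extensions checked are illustrative.

    # cni_config_check.py -- sketch: does the directory the kubelet keeps
    # complaining about contain a CNI network configuration yet?
    from pathlib import Path

    CNI_DIR = Path("/etc/kubernetes/cni/net.d")   # path from the kubelet message

    if not CNI_DIR.is_dir():
        print(f"{CNI_DIR} does not exist")
    else:
        configs = sorted(p.name for p in CNI_DIR.iterdir()
                         if p.suffix in (".conf", ".conflist", ".json"))
        if configs:
            print("CNI configuration files:", ", ".join(configs))
        else:
            print(f"{CNI_DIR} is empty -- the network plugin has not written its config yet")

Once the network provider (ovn-kubernetes on this cluster, per the pod names above) writes its configuration there, the NetworkReady condition should flip and these repeated events stop.
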
pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:10 crc kubenswrapper[4835]: E1003 18:15:10.876039 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.943202 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.943231 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.943239 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.943251 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:10 crc kubenswrapper[4835]: I1003 18:15:10.943260 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:10Z","lastTransitionTime":"2025-10-03T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.045350 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.045392 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.045402 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.045418 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.045428 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:11Z","lastTransitionTime":"2025-10-03T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.147344 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.147378 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.147388 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.147402 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.147411 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:11Z","lastTransitionTime":"2025-10-03T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.249964 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.250003 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.250011 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.250025 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.250034 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:11Z","lastTransitionTime":"2025-10-03T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.352958 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.352992 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.353001 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.353016 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.353025 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:11Z","lastTransitionTime":"2025-10-03T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.455241 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.455278 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.455286 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.455299 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.455308 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:11Z","lastTransitionTime":"2025-10-03T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.557883 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.557920 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.557933 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.557949 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.557962 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:11Z","lastTransitionTime":"2025-10-03T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.659963 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.659996 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.660004 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.660016 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.660026 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:11Z","lastTransitionTime":"2025-10-03T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.762144 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.762184 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.762193 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.762207 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.762216 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:11Z","lastTransitionTime":"2025-10-03T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.864754 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.864789 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.864798 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.864813 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.864823 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:11Z","lastTransitionTime":"2025-10-03T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.876594 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.876663 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:11 crc kubenswrapper[4835]: E1003 18:15:11.876694 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
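
The MountVolume.SetUp failures earlier in this window report objects such as "openshift-network-diagnostics"/"kube-root-ca.crt", "openshift-network-diagnostics"/"openshift-service-ca.crt", "openshift-network-console"/"networking-console-plugin" and the networking-console-plugin-cert secret as "not registered", which appears to mean the kubelet's local ConfigMap/Secret cache has not picked them up yet rather than that they are missing. The hedged sketch below confirms the objects themselves exist on the API server; it assumes the oc CLI is on PATH with a working kubeconfig, and the kinds, namespaces and names are taken from the log.

    # verify_referenced_objects.py -- sketch: confirm that the ConfigMaps and
    # Secrets referenced by the failed volume mounts exist on the API server.
    import subprocess

    OBJECTS = [
        # (kind, namespace, name) -- taken from the mount errors in the log
        ("configmap", "openshift-network-diagnostics", "kube-root-ca.crt"),
        ("configmap", "openshift-network-diagnostics", "openshift-service-ca.crt"),
        ("configmap", "openshift-network-console", "networking-console-plugin"),
        ("secret",    "openshift-network-console", "networking-console-plugin-cert"),
    ]

    for kind, namespace, name in OBJECTS:
        result = subprocess.run(
            ["oc", "get", kind, name, "-n", namespace, "-o", "name"],
            capture_output=True, text=True,
        )
        status = "present" if result.returncode == 0 else "MISSING"
        print(f"{kind} {namespace}/{name}: {status}")

If all four objects are present, the mounts are expected to succeed on a later retry (the kubelet has scheduled the next one for 18:15:41) once its caches sync.
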
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:11 crc kubenswrapper[4835]: E1003 18:15:11.876866 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.967270 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.967305 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.967313 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.967324 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:11 crc kubenswrapper[4835]: I1003 18:15:11.967333 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:11Z","lastTransitionTime":"2025-10-03T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.070182 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.070220 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.070228 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.070242 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.070252 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:12Z","lastTransitionTime":"2025-10-03T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.172313 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.172362 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.172372 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.172387 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.172396 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:12Z","lastTransitionTime":"2025-10-03T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.275605 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.275667 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.275690 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.275721 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.275741 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:12Z","lastTransitionTime":"2025-10-03T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.378951 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.378987 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.378996 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.379009 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.379018 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:12Z","lastTransitionTime":"2025-10-03T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.481644 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.481681 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.481693 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.481709 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.481721 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:12Z","lastTransitionTime":"2025-10-03T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.584417 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.584461 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.584473 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.584489 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.584502 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:12Z","lastTransitionTime":"2025-10-03T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.687398 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.687455 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.687473 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.687498 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.687515 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:12Z","lastTransitionTime":"2025-10-03T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.789853 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.789909 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.789926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.789949 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.789966 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:12Z","lastTransitionTime":"2025-10-03T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.876513 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.876568 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:12 crc kubenswrapper[4835]: E1003 18:15:12.876637 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:12 crc kubenswrapper[4835]: E1003 18:15:12.876688 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
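
The UnmountVolume.TearDown failure for pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 earlier in this window reports "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers", i.e. the hostpath provisioner has not re-registered with this kubelet since the restart. The minimal sketch below checks both the cluster-scoped CSIDriver object and the kubelet's local plugin registration; it assumes the oc CLI is available, and /var/lib/kubelet/plugins_registry is the default kubelet registration directory (an assumption about the node layout, not something shown in the log).

    # csi_registration_check.py -- sketch: is the hostpath provisioner known to
    # the cluster, and has it registered a plugin socket with this kubelet?
    import subprocess
    from pathlib import Path

    DRIVER = "kubevirt.io.hostpath-provisioner"   # name from the TearDownAt error

    # Cluster-scoped CSIDriver object
    api = subprocess.run(["oc", "get", "csidriver", DRIVER, "-o", "name"],
                         capture_output=True, text=True)
    print("CSIDriver object:", "present" if api.returncode == 0 else "missing")

    # Kubelet-local plugin registration sockets (assumed default directory)
    registry = Path("/var/lib/kubelet/plugins_registry")
    entries = [p.name for p in registry.iterdir()] if registry.is_dir() else []
    print("plugins_registry entries:", entries or "none")
    print("driver socket registered:", any(DRIVER in name for name in entries))

Until the provisioner pod is running again and its socket shows up, the kubelet will keep backing off on this unmount exactly as logged (next retry at 18:15:41, after the 32s durationBeforeRetry).
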
pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.892391 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.892424 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.892433 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.892445 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.892454 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:12Z","lastTransitionTime":"2025-10-03T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.995375 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.995454 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.995476 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.995507 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:12 crc kubenswrapper[4835]: I1003 18:15:12.995530 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:12Z","lastTransitionTime":"2025-10-03T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.097840 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.097963 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.097998 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.098024 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.098041 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:13Z","lastTransitionTime":"2025-10-03T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.199932 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.200052 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.200062 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.200091 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.200100 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:13Z","lastTransitionTime":"2025-10-03T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.302366 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.302407 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.302417 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.302429 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.302438 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:13Z","lastTransitionTime":"2025-10-03T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.405085 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.405135 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.405146 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.405159 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.405168 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:13Z","lastTransitionTime":"2025-10-03T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.507358 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.507390 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.507401 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.507413 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.507422 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:13Z","lastTransitionTime":"2025-10-03T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.609990 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.610023 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.610031 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.610045 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.610055 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:13Z","lastTransitionTime":"2025-10-03T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.712569 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.712610 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.712621 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.712636 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.712648 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:13Z","lastTransitionTime":"2025-10-03T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.815043 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.815094 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.815102 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.815114 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.815123 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:13Z","lastTransitionTime":"2025-10-03T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.876669 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.876704 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:13 crc kubenswrapper[4835]: E1003 18:15:13.876819 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:13 crc kubenswrapper[4835]: E1003 18:15:13.876954 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.918079 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.918137 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.918154 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.918171 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:13 crc kubenswrapper[4835]: I1003 18:15:13.918182 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:13Z","lastTransitionTime":"2025-10-03T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.019854 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.019901 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.019916 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.019936 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.019954 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:14Z","lastTransitionTime":"2025-10-03T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.122476 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.122523 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.122532 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.122545 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.122556 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:14Z","lastTransitionTime":"2025-10-03T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.224472 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.224536 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.224548 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.224563 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.224576 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:14Z","lastTransitionTime":"2025-10-03T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.327354 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.327411 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.327423 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.327445 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.327458 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:14Z","lastTransitionTime":"2025-10-03T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.429692 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.429733 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.429744 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.429760 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.429773 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:14Z","lastTransitionTime":"2025-10-03T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.532356 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.532397 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.532405 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.532421 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.532431 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:14Z","lastTransitionTime":"2025-10-03T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.634972 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.635039 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.635061 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.635131 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.635151 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:14Z","lastTransitionTime":"2025-10-03T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.638854 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.638903 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.638918 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.638940 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.638959 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:14Z","lastTransitionTime":"2025-10-03T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:14 crc kubenswrapper[4835]: E1003 18:15:14.654680 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.658254 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.658369 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.658450 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.658523 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.658581 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:14Z","lastTransitionTime":"2025-10-03T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:14 crc kubenswrapper[4835]: E1003 18:15:14.672770 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.677190 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.677261 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.677300 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.677330 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.677353 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:14Z","lastTransitionTime":"2025-10-03T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:14 crc kubenswrapper[4835]: E1003 18:15:14.694474 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.698788 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.698842 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.698858 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.698877 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.698892 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:14Z","lastTransitionTime":"2025-10-03T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:14 crc kubenswrapper[4835]: E1003 18:15:14.712668 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.716318 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.716407 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.716476 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.716536 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.716591 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:14Z","lastTransitionTime":"2025-10-03T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:14 crc kubenswrapper[4835]: E1003 18:15:14.726683 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:14 crc kubenswrapper[4835]: E1003 18:15:14.727096 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.737137 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.737241 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.737326 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.737424 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.737512 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:14Z","lastTransitionTime":"2025-10-03T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.839790 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.839824 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.839835 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.839850 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.839860 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:14Z","lastTransitionTime":"2025-10-03T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.876628 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:14 crc kubenswrapper[4835]: E1003 18:15:14.876738 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.876629 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:14 crc kubenswrapper[4835]: E1003 18:15:14.876972 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.941836 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.941883 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.941891 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.941904 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:14 crc kubenswrapper[4835]: I1003 18:15:14.941913 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:14Z","lastTransitionTime":"2025-10-03T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.044539 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.044573 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.044581 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.044594 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.044603 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:15Z","lastTransitionTime":"2025-10-03T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.146228 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.146262 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.146271 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.146284 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.146292 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:15Z","lastTransitionTime":"2025-10-03T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.248615 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.248677 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.248694 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.248717 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.248739 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:15Z","lastTransitionTime":"2025-10-03T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.350992 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.351031 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.351040 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.351053 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.351063 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:15Z","lastTransitionTime":"2025-10-03T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.453625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.453660 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.453669 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.453683 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.453694 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:15Z","lastTransitionTime":"2025-10-03T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.556185 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.556298 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.556321 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.556349 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.556372 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:15Z","lastTransitionTime":"2025-10-03T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.658988 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.659055 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.659112 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.659141 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.659163 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:15Z","lastTransitionTime":"2025-10-03T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.761938 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.762315 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.762452 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.762693 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.762840 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:15Z","lastTransitionTime":"2025-10-03T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.865671 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.865718 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.865729 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.865747 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.865757 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:15Z","lastTransitionTime":"2025-10-03T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.876002 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.876025 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:15 crc kubenswrapper[4835]: E1003 18:15:15.876148 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:15 crc kubenswrapper[4835]: E1003 18:15:15.876235 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.971744 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.971818 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.971830 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.971846 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:15 crc kubenswrapper[4835]: I1003 18:15:15.971861 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:15Z","lastTransitionTime":"2025-10-03T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.074434 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.074480 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.074491 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.074508 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.074530 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:16Z","lastTransitionTime":"2025-10-03T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.176378 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.176427 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.176442 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.176462 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.176477 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:16Z","lastTransitionTime":"2025-10-03T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.278662 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.278697 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.278706 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.278719 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.278729 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:16Z","lastTransitionTime":"2025-10-03T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.381138 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.381174 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.381182 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.381195 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.381205 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:16Z","lastTransitionTime":"2025-10-03T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.483339 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.483398 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.483406 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.483420 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.483431 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:16Z","lastTransitionTime":"2025-10-03T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.585949 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.586022 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.586037 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.586055 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.586084 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:16Z","lastTransitionTime":"2025-10-03T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.689220 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.689299 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.689312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.689328 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.689338 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:16Z","lastTransitionTime":"2025-10-03T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.792063 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.792143 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.792159 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.792181 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.792198 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:16Z","lastTransitionTime":"2025-10-03T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.876308 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.876334 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:16 crc kubenswrapper[4835]: E1003 18:15:16.876479 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:16 crc kubenswrapper[4835]: E1003 18:15:16.876619 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.895059 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.895134 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.895144 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.895159 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.895171 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:16Z","lastTransitionTime":"2025-10-03T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.997855 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.997907 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.997923 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.997945 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:16 crc kubenswrapper[4835]: I1003 18:15:16.997963 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:16Z","lastTransitionTime":"2025-10-03T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.100654 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.100692 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.100703 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.100719 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.100729 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:17Z","lastTransitionTime":"2025-10-03T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.202723 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.202758 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.202766 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.202779 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.202787 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:17Z","lastTransitionTime":"2025-10-03T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.305125 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.305162 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.305171 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.305185 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.305193 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:17Z","lastTransitionTime":"2025-10-03T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.407342 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.407376 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.407384 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.407397 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.407406 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:17Z","lastTransitionTime":"2025-10-03T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.509618 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.509664 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.509672 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.509687 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.509709 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:17Z","lastTransitionTime":"2025-10-03T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.612858 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.612910 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.612921 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.612938 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.612952 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:17Z","lastTransitionTime":"2025-10-03T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.715525 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.715583 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.715594 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.715610 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.715621 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:17Z","lastTransitionTime":"2025-10-03T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.818338 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.818432 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.818446 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.818461 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.818472 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:17Z","lastTransitionTime":"2025-10-03T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.876795 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.876844 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:17 crc kubenswrapper[4835]: E1003 18:15:17.876944 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:17 crc kubenswrapper[4835]: E1003 18:15:17.877144 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.922108 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.922157 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.922185 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.922204 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:17 crc kubenswrapper[4835]: I1003 18:15:17.922219 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:17Z","lastTransitionTime":"2025-10-03T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.024607 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.024652 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.024667 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.024686 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.024700 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:18Z","lastTransitionTime":"2025-10-03T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.127711 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.127753 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.127764 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.127781 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.127792 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:18Z","lastTransitionTime":"2025-10-03T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.230697 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.230730 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.230738 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.230753 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.230764 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:18Z","lastTransitionTime":"2025-10-03T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.333143 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.333183 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.333191 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.333205 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.333215 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:18Z","lastTransitionTime":"2025-10-03T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.436378 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.436430 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.436444 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.436463 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.436476 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:18Z","lastTransitionTime":"2025-10-03T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.540051 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.540197 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.540237 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.540277 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.540297 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:18Z","lastTransitionTime":"2025-10-03T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.642841 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.642898 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.642910 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.642926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.642938 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:18Z","lastTransitionTime":"2025-10-03T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.746744 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.746807 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.746817 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.746833 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.746846 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:18Z","lastTransitionTime":"2025-10-03T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.849139 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.849176 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.849185 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.849201 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.849211 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:18Z","lastTransitionTime":"2025-10-03T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.875871 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.875928 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:18 crc kubenswrapper[4835]: E1003 18:15:18.876345 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:18 crc kubenswrapper[4835]: E1003 18:15:18.876519 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.877950 4835 scope.go:117] "RemoveContainer" containerID="18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4" Oct 03 18:15:18 crc kubenswrapper[4835]: E1003 18:15:18.878371 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.897301 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5385e85b-313e-4b33-bf09-0b8f5e0a994a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e35992e79cb51b47dd78356feedcca634b95f0fbd0aac49017b88d555dab225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b9902dcf4b7c66e119168c3b3eb90f437ca1e723186a43c648d19f4101b851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aa63760f1ef0079ef65f47e141a5938e22007e5f3bf41a1e61491359b1eb0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:18Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.909510 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:18Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.924348 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:18Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.945044 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:18Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.952676 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.952724 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.952735 4835 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.952757 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.952771 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:18Z","lastTransitionTime":"2025-10-03T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.963641 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:18Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.978671 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:18Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:18 crc kubenswrapper[4835]: I1003 18:15:18.992635 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:18Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.012599 4835 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:19Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.026401 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:19Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.041200 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb
03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:19Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.056079 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.056122 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.056133 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.056149 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.056163 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:19Z","lastTransitionTime":"2025-10-03T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.056192 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:19Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.074354 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:19Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.092146 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:19Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.105304 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:19Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.132851 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:19Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.144838 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:19Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.158326 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.158378 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.158393 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.158415 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.158428 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:19Z","lastTransitionTime":"2025-10-03T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.161022 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:19Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.209188 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/r
un/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:05Z\\\",\\\"message\\\":\\\"g zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 18:15:05.650934 6487 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 807.71µs\\\\nI1003 18:15:05.650937 6487 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 785.9µs\\\\nI1003 18:15:05.650872 6487 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 18:15:05.650944 6487 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to 
call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:15:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:19Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.261863 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.261929 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.261948 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.261970 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.261983 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:19Z","lastTransitionTime":"2025-10-03T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.364678 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.364721 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.364732 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.364748 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.364759 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:19Z","lastTransitionTime":"2025-10-03T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.467261 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.467300 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.467308 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.467321 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.467332 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:19Z","lastTransitionTime":"2025-10-03T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.569479 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.569520 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.569528 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.569541 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.569551 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:19Z","lastTransitionTime":"2025-10-03T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.671545 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.671579 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.671588 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.671652 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.671666 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:19Z","lastTransitionTime":"2025-10-03T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.774043 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.774093 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.774105 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.774119 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.774128 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:19Z","lastTransitionTime":"2025-10-03T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.876144 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.876253 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:19 crc kubenswrapper[4835]: E1003 18:15:19.876373 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.876818 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.876856 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:19 crc kubenswrapper[4835]: E1003 18:15:19.876784 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.876869 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.876927 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.876943 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:19Z","lastTransitionTime":"2025-10-03T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.980262 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.980323 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.980335 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.980351 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:19 crc kubenswrapper[4835]: I1003 18:15:19.980361 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:19Z","lastTransitionTime":"2025-10-03T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.083191 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.083236 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.083246 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.083258 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.083267 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:20Z","lastTransitionTime":"2025-10-03T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.185421 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.185486 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.185497 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.185533 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.185548 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:20Z","lastTransitionTime":"2025-10-03T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.287322 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.287362 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.287374 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.287391 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.287404 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:20Z","lastTransitionTime":"2025-10-03T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.389949 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.390005 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.390036 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.390059 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.390095 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:20Z","lastTransitionTime":"2025-10-03T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.493020 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.493098 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.493112 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.493133 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.493145 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:20Z","lastTransitionTime":"2025-10-03T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.595147 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.595200 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.595213 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.595230 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.595240 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:20Z","lastTransitionTime":"2025-10-03T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.697314 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.697353 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.697363 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.697378 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.697389 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:20Z","lastTransitionTime":"2025-10-03T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.799496 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.799534 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.799545 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.799560 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.799570 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:20Z","lastTransitionTime":"2025-10-03T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.877325 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:20 crc kubenswrapper[4835]: E1003 18:15:20.877452 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.877325 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:20 crc kubenswrapper[4835]: E1003 18:15:20.877648 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.901568 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.901614 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.901625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.901642 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:20 crc kubenswrapper[4835]: I1003 18:15:20.901655 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:20Z","lastTransitionTime":"2025-10-03T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.004341 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.004375 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.004385 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.004397 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.004406 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:21Z","lastTransitionTime":"2025-10-03T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.106895 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.107171 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.107191 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.107208 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.107219 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:21Z","lastTransitionTime":"2025-10-03T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.209337 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.209579 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.209645 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.209716 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.209787 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:21Z","lastTransitionTime":"2025-10-03T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.312467 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.312540 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.312559 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.312590 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.312609 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:21Z","lastTransitionTime":"2025-10-03T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.414872 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.415240 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.415353 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.415428 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.415501 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:21Z","lastTransitionTime":"2025-10-03T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.517797 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.517840 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.517852 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.517869 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.517883 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:21Z","lastTransitionTime":"2025-10-03T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.619911 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.619954 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.619967 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.619986 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.619999 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:21Z","lastTransitionTime":"2025-10-03T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.721992 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.722267 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.722361 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.722451 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.722528 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:21Z","lastTransitionTime":"2025-10-03T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.825237 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.825284 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.825293 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.825305 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.825316 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:21Z","lastTransitionTime":"2025-10-03T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.876183 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.876184 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:21 crc kubenswrapper[4835]: E1003 18:15:21.876633 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:21 crc kubenswrapper[4835]: E1003 18:15:21.876545 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.926931 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.926993 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.927005 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.927021 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:21 crc kubenswrapper[4835]: I1003 18:15:21.927032 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:21Z","lastTransitionTime":"2025-10-03T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.029093 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.029137 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.029149 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.029165 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.029176 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:22Z","lastTransitionTime":"2025-10-03T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.134218 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.134728 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.134777 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.134810 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.134834 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:22Z","lastTransitionTime":"2025-10-03T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.236838 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.236881 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.236891 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.236911 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.236920 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:22Z","lastTransitionTime":"2025-10-03T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.338439 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.338469 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.338477 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.338489 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.338497 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:22Z","lastTransitionTime":"2025-10-03T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.440383 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.440414 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.440423 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.440435 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.440444 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:22Z","lastTransitionTime":"2025-10-03T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.541895 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.541915 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.541924 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.541936 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.541944 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:22Z","lastTransitionTime":"2025-10-03T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.644404 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.644437 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.644446 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.644460 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.644469 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:22Z","lastTransitionTime":"2025-10-03T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.746795 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.746845 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.746857 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.746877 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.746889 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:22Z","lastTransitionTime":"2025-10-03T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.848968 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.849011 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.849019 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.849033 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.849043 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:22Z","lastTransitionTime":"2025-10-03T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.876497 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.876522 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:22 crc kubenswrapper[4835]: E1003 18:15:22.876627 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:22 crc kubenswrapper[4835]: E1003 18:15:22.876735 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.951647 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.951691 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.951704 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.951721 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:22 crc kubenswrapper[4835]: I1003 18:15:22.951733 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:22Z","lastTransitionTime":"2025-10-03T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.053616 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.053660 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.053668 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.053683 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.053692 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:23Z","lastTransitionTime":"2025-10-03T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.156654 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.156691 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.156704 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.156717 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.156726 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:23Z","lastTransitionTime":"2025-10-03T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.259392 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.259439 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.259454 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.259472 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.259484 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:23Z","lastTransitionTime":"2025-10-03T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.362045 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.362106 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.362118 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.362134 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.362144 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:23Z","lastTransitionTime":"2025-10-03T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.463990 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.464039 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.464047 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.464063 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.464097 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:23Z","lastTransitionTime":"2025-10-03T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.566284 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.566324 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.566332 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.566347 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.566357 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:23Z","lastTransitionTime":"2025-10-03T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.668580 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.668617 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.668625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.668637 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.668646 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:23Z","lastTransitionTime":"2025-10-03T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.770843 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.770877 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.770885 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.770899 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.770909 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:23Z","lastTransitionTime":"2025-10-03T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.872701 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.872742 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.872750 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.872764 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.872774 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:23Z","lastTransitionTime":"2025-10-03T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.876010 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.876014 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:23 crc kubenswrapper[4835]: E1003 18:15:23.876151 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:23 crc kubenswrapper[4835]: E1003 18:15:23.876278 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.974349 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.974377 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.974385 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.974397 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:23 crc kubenswrapper[4835]: I1003 18:15:23.974406 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:23Z","lastTransitionTime":"2025-10-03T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.076939 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.076985 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.076997 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.077010 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.077019 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:24Z","lastTransitionTime":"2025-10-03T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.179502 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.179536 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.179544 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.179556 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.179565 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:24Z","lastTransitionTime":"2025-10-03T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.281641 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.281936 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.282099 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.282229 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.282317 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:24Z","lastTransitionTime":"2025-10-03T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.385006 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.385337 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.385426 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.385511 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.385590 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:24Z","lastTransitionTime":"2025-10-03T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.488108 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.488150 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.488202 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.488223 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.488236 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:24Z","lastTransitionTime":"2025-10-03T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.590818 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.590856 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.590865 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.590879 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.590888 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:24Z","lastTransitionTime":"2025-10-03T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.693039 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.693104 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.693117 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.693144 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.693153 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:24Z","lastTransitionTime":"2025-10-03T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.794878 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.794944 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.794955 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.794993 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.795005 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:24Z","lastTransitionTime":"2025-10-03T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.876011 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.876117 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:24 crc kubenswrapper[4835]: E1003 18:15:24.876165 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:24 crc kubenswrapper[4835]: E1003 18:15:24.876236 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.896958 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.897002 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.897014 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.897031 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.897042 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:24Z","lastTransitionTime":"2025-10-03T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.915443 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.915496 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.915509 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.915526 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.915538 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:24Z","lastTransitionTime":"2025-10-03T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.923973 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs\") pod \"network-metrics-daemon-vlmkl\" (UID: \"e2705556-f411-476d-9d8a-78543bae8dc7\") " pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:24 crc kubenswrapper[4835]: E1003 18:15:24.924095 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 18:15:24 crc kubenswrapper[4835]: E1003 18:15:24.924153 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs podName:e2705556-f411-476d-9d8a-78543bae8dc7 nodeName:}" failed. No retries permitted until 2025-10-03 18:15:56.924137652 +0000 UTC m=+98.640078524 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs") pod "network-metrics-daemon-vlmkl" (UID: "e2705556-f411-476d-9d8a-78543bae8dc7") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 18:15:24 crc kubenswrapper[4835]: E1003 18:15:24.929992 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:24Z is after 
2025-08-24T17:21:41Z" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.932892 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.932939 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.932948 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.932962 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.932971 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:24Z","lastTransitionTime":"2025-10-03T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:24 crc kubenswrapper[4835]: E1003 18:15:24.943394 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:24Z is after 
2025-08-24T17:21:41Z" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.945985 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.946008 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.946017 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.946030 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.946039 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:24Z","lastTransitionTime":"2025-10-03T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:24 crc kubenswrapper[4835]: E1003 18:15:24.955901 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:24Z is after 
2025-08-24T17:21:41Z" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.958840 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.958875 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.958884 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.958897 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.958905 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:24Z","lastTransitionTime":"2025-10-03T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:24 crc kubenswrapper[4835]: E1003 18:15:24.969327 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:24Z is after 
2025-08-24T17:21:41Z" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.972680 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.972721 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.972735 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.972756 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.972770 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:24Z","lastTransitionTime":"2025-10-03T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:24 crc kubenswrapper[4835]: E1003 18:15:24.983507 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:24Z is after 
2025-08-24T17:21:41Z" Oct 03 18:15:24 crc kubenswrapper[4835]: E1003 18:15:24.983699 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.999400 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.999436 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.999444 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.999458 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:24 crc kubenswrapper[4835]: I1003 18:15:24.999468 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:24Z","lastTransitionTime":"2025-10-03T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.101407 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.101437 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.101444 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.101456 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.101465 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:25Z","lastTransitionTime":"2025-10-03T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.203356 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.203876 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.203986 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.204124 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.204247 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:25Z","lastTransitionTime":"2025-10-03T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.306986 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.307029 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.307038 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.307086 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.307100 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:25Z","lastTransitionTime":"2025-10-03T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.408855 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.408907 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.408931 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.408943 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.408952 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:25Z","lastTransitionTime":"2025-10-03T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.511388 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.511462 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.511471 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.511487 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.511496 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:25Z","lastTransitionTime":"2025-10-03T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.613710 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.613749 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.613757 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.613772 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.613784 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:25Z","lastTransitionTime":"2025-10-03T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.716424 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.716469 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.716479 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.716495 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.716506 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:25Z","lastTransitionTime":"2025-10-03T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.819380 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.819470 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.819490 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.819521 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.819540 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:25Z","lastTransitionTime":"2025-10-03T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.876887 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.877143 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:25 crc kubenswrapper[4835]: E1003 18:15:25.877361 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:25 crc kubenswrapper[4835]: E1003 18:15:25.877440 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.891540 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.922125 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.922164 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.922173 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.922186 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:25 crc kubenswrapper[4835]: I1003 18:15:25.922196 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:25Z","lastTransitionTime":"2025-10-03T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.025362 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.025436 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.025455 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.025510 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.025531 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:26Z","lastTransitionTime":"2025-10-03T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.128335 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.128396 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.128438 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.128465 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.128482 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:26Z","lastTransitionTime":"2025-10-03T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.230914 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.230958 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.230967 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.230987 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.230997 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:26Z","lastTransitionTime":"2025-10-03T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.333740 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.333787 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.333800 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.333817 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.333828 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:26Z","lastTransitionTime":"2025-10-03T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.435990 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.436033 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.436041 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.436054 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.436064 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:26Z","lastTransitionTime":"2025-10-03T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.538359 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.538389 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.538397 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.538412 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.538421 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:26Z","lastTransitionTime":"2025-10-03T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.641021 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.641103 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.641116 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.641139 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.641153 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:26Z","lastTransitionTime":"2025-10-03T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.743456 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.743522 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.743536 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.743557 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.743572 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:26Z","lastTransitionTime":"2025-10-03T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.845875 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.845993 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.846008 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.846057 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.846091 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:26Z","lastTransitionTime":"2025-10-03T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.876750 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.876844 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:26 crc kubenswrapper[4835]: E1003 18:15:26.876907 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:26 crc kubenswrapper[4835]: E1003 18:15:26.877122 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.948210 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.948254 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.948263 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.948278 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:26 crc kubenswrapper[4835]: I1003 18:15:26.948289 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:26Z","lastTransitionTime":"2025-10-03T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.050273 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.050314 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.050324 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.050341 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.050353 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:27Z","lastTransitionTime":"2025-10-03T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.152199 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.152233 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.152241 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.152253 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.152263 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:27Z","lastTransitionTime":"2025-10-03T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.222491 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8p9cd_fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93/kube-multus/0.log" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.222541 4835 generic.go:334] "Generic (PLEG): container finished" podID="fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93" containerID="d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33" exitCode=1 Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.222568 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8p9cd" event={"ID":"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93","Type":"ContainerDied","Data":"d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33"} Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.222900 4835 scope.go:117] "RemoveContainer" containerID="d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.234504 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.245408 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.254430 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.254464 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.254473 4835 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.254488 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.254499 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:27Z","lastTransitionTime":"2025-10-03T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.257565 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.267301 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.278046 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.292001 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.304615 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.315927 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.327418 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.338611 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.347691 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.356775 4835 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.356807 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.356816 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.356830 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.356842 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:27Z","lastTransitionTime":"2025-10-03T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.363616 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.373688 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.383482 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:26Z\\\",\\\"message\\\":\\\"2025-10-03T18:14:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_97c1c8cc-26ee-4f7c-a050-ff4d91cafa9d\\\\n2025-10-03T18:14:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_97c1c8cc-26ee-4f7c-a050-ff4d91cafa9d to /host/opt/cni/bin/\\\\n2025-10-03T18:14:41Z [verbose] multus-daemon started\\\\n2025-10-03T18:14:41Z [verbose] Readiness Indicator file check\\\\n2025-10-03T18:15:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.398269 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:05Z\\\",\\\"message\\\":\\\"g zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 18:15:05.650934 6487 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 807.71µs\\\\nI1003 18:15:05.650937 6487 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 785.9µs\\\\nI1003 18:15:05.650872 6487 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 18:15:05.650944 6487 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:15:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.406957 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.415406 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b788715-74fe-4091-ad81-675af7bc1519\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6bb20070b73452498da6c6a6f79e01551a0d203cc7d85f39bc13b9e68482be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Runni
ng\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.423725 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5385e85b-313e-4b33-bf09-0b8f5e0a994a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e35992e79cb51b47dd78356feedcca634b95f0fbd0aac49017b88d555dab225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b9902dcf4b7c66e119168c3b3eb90f437ca1e723186a43c648d19f4101b851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aa63760f1ef0079ef65f47e141a5938e22007e5f3bf41a1e61491359b1eb0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.431718 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:27Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.459168 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.459196 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.459206 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.459222 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.459234 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:27Z","lastTransitionTime":"2025-10-03T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.561614 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.561641 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.561649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.561662 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.561671 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:27Z","lastTransitionTime":"2025-10-03T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.664206 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.664243 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.664255 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.664272 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.664283 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:27Z","lastTransitionTime":"2025-10-03T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.766411 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.766444 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.766452 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.766468 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.766477 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:27Z","lastTransitionTime":"2025-10-03T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.868243 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.868283 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.868292 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.868304 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.868313 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:27Z","lastTransitionTime":"2025-10-03T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.876471 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.876566 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:27 crc kubenswrapper[4835]: E1003 18:15:27.876676 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:27 crc kubenswrapper[4835]: E1003 18:15:27.876825 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
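The NotReady condition being recorded over and over above always carries the same reason: the kubelet finds no CNI configuration file under /etc/kubernetes/cni/net.d/, and in this cluster that file is only expected to appear once the crash-looping ovnkube-controller container earlier in the log gets past startup. The following is a minimal stand-alone sketch of that same directory check, not the kubelet's own code; the directory path is taken from the log, and the extension list follows the usual CNI conventions and is an assumption here.

```go
// cnicheck.go - minimal sketch: report whether a CNI conf directory contains
// any usable configuration files, mirroring the condition the kubelet logs
// above ("no CNI configuration file in /etc/kubernetes/cni/net.d/").
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // path taken from the kubelet error above
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", confDir, err)
		os.Exit(1)
	}
	var found []string
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		// Conventional CNI config extensions; assumption, not read from the log.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			found = append(found, e.Name())
		}
	}
	if len(found) == 0 {
		fmt.Printf("no CNI configuration file in %s (NetworkReady stays false)\n", confDir)
		return
	}
	fmt.Printf("CNI configuration present: %v\n", found)
}
```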
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.970373 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.970411 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.970419 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.970432 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:27 crc kubenswrapper[4835]: I1003 18:15:27.970441 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:27Z","lastTransitionTime":"2025-10-03T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.072651 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.072692 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.072700 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.072714 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.072723 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:28Z","lastTransitionTime":"2025-10-03T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.175886 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.175925 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.175935 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.175949 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.175959 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:28Z","lastTransitionTime":"2025-10-03T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.227251 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8p9cd_fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93/kube-multus/0.log" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.227306 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8p9cd" event={"ID":"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93","Type":"ContainerStarted","Data":"12ccf52445e391368af99975592bd8f1206e9a136c9bc04732839082fcaecde1"} Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.241528 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
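The ContainerStarted event above shows kube-multus coming back after its earlier exit, and the termination message it logged explains what the new instance will attempt again: poll for the readiness indicator file /host/run/multus/cni/net.d/10-ovn-kubernetes.conf until a timeout ("pollimmediate error: timed out waiting for the condition"), which in the previous attempt ran from roughly 18:14:41 to 18:15:26. The sketch below is a simplified stdlib version of that poll-until-timeout pattern, not multus's actual implementation; the file path and the ~45 s budget are read off the log, the interval is an assumption.

```go
// readiness_poll.go - simplified sketch of the "Readiness Indicator file check"
// multus logs above: poll for a file until it exists or a timeout expires.
package main

import (
	"errors"
	"fmt"
	"os"
	"time"
)

// waitForFile checks immediately, then keeps re-checking at the given interval
// until the file exists or the timeout elapses.
func waitForFile(path string, interval, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		if _, err := os.Stat(path); err == nil {
			return nil // indicator file present: the default network is ready
		}
		if time.Now().After(deadline) {
			return errors.New("timed out waiting for the condition")
		}
		time.Sleep(interval)
	}
}

func main() {
	// Path taken from the kube-multus termination message; timeout approximated
	// from the log timestamps (18:14:41 -> 18:15:26).
	const indicator = "/host/run/multus/cni/net.d/10-ovn-kubernetes.conf"
	if err := waitForFile(indicator, time.Second, 45*time.Second); err != nil {
		fmt.Printf("still waiting for readiness indicator file %s: %v\n", indicator, err)
		os.Exit(1)
	}
	fmt.Println("readiness indicator file found")
}
```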
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.253810 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
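Every "Failed to update status for pod" entry in this stretch fails for the one reason spelled out in the x509 error text: the serving certificate of the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, long before the node's current clock of 2025-10-03. Below is a minimal sketch for confirming that directly from the node; the address comes from the log, and InsecureSkipVerify is set on purpose so the handshake completes even though the certificate is expired (inspection only, not a pattern for real traffic).

```go
// certcheck.go - minimal sketch: dial the webhook endpoint named in the kubelet
// errors above and print the validity window of its leaf certificate.
package main

import (
	"crypto/tls"
	"fmt"
	"os"
	"time"
)

func main() {
	addr := "127.0.0.1:9743" // pod.network-node-identity.openshift.io webhook, per the errors above
	// InsecureSkipVerify is deliberate: we want to inspect the expired
	// certificate, not validate it. Do not reuse this config for real traffic.
	conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Printf("handshake with %s failed: %v\n", addr, err)
		os.Exit(1)
	}
	defer conn.Close()

	certs := conn.ConnectionState().PeerCertificates
	if len(certs) == 0 {
		fmt.Println("server presented no certificate")
		os.Exit(1)
	}
	leaf := certs[0]
	fmt.Printf("subject:   %s\n", leaf.Subject)
	fmt.Printf("notBefore: %s\n", leaf.NotBefore.UTC().Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", leaf.NotAfter.UTC().Format(time.RFC3339))
	if time.Now().UTC().After(leaf.NotAfter) {
		fmt.Println("certificate has expired; webhook calls will keep being rejected")
	}
}
```

Until that certificate is rotated, the same x509 failure also blocks ovnkube-controller's node annotations (see its crash message earlier in this log), which appears to be what keeps the CNI config from being written and the node stuck NotReady.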
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.265145 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.275334 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 
18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.277940 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.277968 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.277976 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.277990 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.278000 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:28Z","lastTransitionTime":"2025-10-03T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.285403 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63
a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.298217 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347202
43b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.310396 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.322814 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.333730 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.343355 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.355646 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.374802 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a
518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.381397 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.381425 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.381432 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.381445 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.381454 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:28Z","lastTransitionTime":"2025-10-03T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.384188 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.394924 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12ccf52445e391368af99975592bd8f1206e9a136c9bc04732839082fcaecde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:26Z\\\",\\\"message\\\":\\\"2025-10-03T18:14:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_97c1c8cc-26ee-4f7c-a050-ff4d91cafa9d\\\\n2025-10-03T18:14:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_97c1c8cc-26ee-4f7c-a050-ff4d91cafa9d to /host/opt/cni/bin/\\\\n2025-10-03T18:14:41Z [verbose] multus-daemon started\\\\n2025-10-03T18:14:41Z [verbose] Readiness Indicator file check\\\\n2025-10-03T18:15:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.421109 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:05Z\\\",\\\"message\\\":\\\"g zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 18:15:05.650934 6487 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 807.71µs\\\\nI1003 18:15:05.650937 6487 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 785.9µs\\\\nI1003 18:15:05.650872 6487 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 18:15:05.650944 6487 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:15:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.430360 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.438208 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b788715-74fe-4091-ad81-675af7bc1519\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6bb20070b73452498da6c6a6f79e01551a0d203cc7d85f39bc13b9e68482be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Runni
ng\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.447204 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5385e85b-313e-4b33-bf09-0b8f5e0a994a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e35992e79cb51b47dd78356feedcca634b95f0fbd0aac49017b88d555dab225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b9902dcf4b7c66e119168c3b3eb90f437ca1e723186a43c648d19f4101b851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aa63760f1ef0079ef65f47e141a5938e22007e5f3bf41a1e61491359b1eb0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.455026 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.483708 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.483745 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.483753 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.483766 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.483776 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:28Z","lastTransitionTime":"2025-10-03T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.586263 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.586307 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.586355 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.586375 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.586388 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:28Z","lastTransitionTime":"2025-10-03T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.688721 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.688769 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.688779 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.688793 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.688803 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:28Z","lastTransitionTime":"2025-10-03T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.791382 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.791427 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.791436 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.791451 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.791461 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:28Z","lastTransitionTime":"2025-10-03T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.876365 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.876371 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:28 crc kubenswrapper[4835]: E1003 18:15:28.876488 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:28 crc kubenswrapper[4835]: E1003 18:15:28.876587 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.886793 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b788715-74fe-4091-ad81-675af7bc1519\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6bb20070b73452498da6c6a6f79e01551a0d203cc7d85f39bc13b9e68482be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.893218 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.893245 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.893253 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.893266 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.893274 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:28Z","lastTransitionTime":"2025-10-03T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.897199 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5385e85b-313e-4b33-bf09-0b8f5e0a994a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e35992e79cb51b47dd78356feedcca634b95f0fbd0aac49017b88d555dab225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b9902dcf4b7c66e119168c3b3eb90f437ca1e723186a43c648d19f4101b851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aa63760f1ef0079ef65f47e141a5938e22007e5f3bf41a1e61491359b1eb0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.906141 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.917926 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.927772 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.940959 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.950878 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 
18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.963034 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.976489 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 
18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.987256 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.995418 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.995460 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.995469 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.995481 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:28 crc kubenswrapper[4835]: I1003 18:15:28.995491 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:28Z","lastTransitionTime":"2025-10-03T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.000287 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:28Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.010186 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:29Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.022232 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:29Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.031685 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:29Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.048884 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a
518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:29Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.058191 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:29Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.068979 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12ccf52445e391368af99975592bd8f1206e9a136c9bc04732839082fcaecde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:26Z\\\",\\\"message\\\":\\\"2025-10-03T18:14:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_97c1c8cc-26ee-4f7c-a050-ff4d91cafa9d\\\\n2025-10-03T18:14:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_97c1c8cc-26ee-4f7c-a050-ff4d91cafa9d to /host/opt/cni/bin/\\\\n2025-10-03T18:14:41Z [verbose] multus-daemon started\\\\n2025-10-03T18:14:41Z [verbose] Readiness Indicator file check\\\\n2025-10-03T18:15:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:29Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.084841 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:05Z\\\",\\\"message\\\":\\\"g zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 18:15:05.650934 6487 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 807.71µs\\\\nI1003 18:15:05.650937 6487 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 785.9µs\\\\nI1003 18:15:05.650872 6487 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 18:15:05.650944 6487 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:15:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:29Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.093336 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:29Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.097731 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.097765 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.097774 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.097788 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.097797 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:29Z","lastTransitionTime":"2025-10-03T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.199934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.199974 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.199987 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.200003 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.200014 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:29Z","lastTransitionTime":"2025-10-03T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.302520 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.302567 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.302577 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.302591 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.302602 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:29Z","lastTransitionTime":"2025-10-03T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.404501 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.404538 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.404546 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.404561 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.404570 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:29Z","lastTransitionTime":"2025-10-03T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.506492 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.506531 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.506543 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.506557 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.506567 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:29Z","lastTransitionTime":"2025-10-03T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.608669 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.608719 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.608730 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.608748 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.608760 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:29Z","lastTransitionTime":"2025-10-03T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.711210 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.711251 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.711259 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.711274 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.711284 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:29Z","lastTransitionTime":"2025-10-03T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.813666 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.813722 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.813736 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.813753 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.813764 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:29Z","lastTransitionTime":"2025-10-03T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.876486 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.876543 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:29 crc kubenswrapper[4835]: E1003 18:15:29.876645 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:29 crc kubenswrapper[4835]: E1003 18:15:29.876749 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.916297 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.916343 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.916357 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.916373 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:29 crc kubenswrapper[4835]: I1003 18:15:29.916384 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:29Z","lastTransitionTime":"2025-10-03T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.018965 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.018996 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.019004 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.019188 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.019199 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:30Z","lastTransitionTime":"2025-10-03T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.121759 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.121808 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.121821 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.121839 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.121851 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:30Z","lastTransitionTime":"2025-10-03T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.224155 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.224188 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.224195 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.224208 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.224218 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:30Z","lastTransitionTime":"2025-10-03T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.326335 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.326367 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.326375 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.326389 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.326399 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:30Z","lastTransitionTime":"2025-10-03T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.429085 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.429123 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.429131 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.429144 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.429153 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:30Z","lastTransitionTime":"2025-10-03T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.531417 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.531457 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.531466 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.531483 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.531495 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:30Z","lastTransitionTime":"2025-10-03T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.634358 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.634427 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.634436 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.634448 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.634456 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:30Z","lastTransitionTime":"2025-10-03T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.736799 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.736870 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.736883 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.736897 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.736907 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:30Z","lastTransitionTime":"2025-10-03T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.838971 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.839011 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.839022 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.839036 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.839047 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:30Z","lastTransitionTime":"2025-10-03T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.875933 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.876057 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:30 crc kubenswrapper[4835]: E1003 18:15:30.876093 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:30 crc kubenswrapper[4835]: E1003 18:15:30.876222 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.941564 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.941599 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.941607 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.941621 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:30 crc kubenswrapper[4835]: I1003 18:15:30.941630 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:30Z","lastTransitionTime":"2025-10-03T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.044190 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.044232 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.044244 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.044260 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.044271 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:31Z","lastTransitionTime":"2025-10-03T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.145997 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.146050 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.146060 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.146095 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.146104 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:31Z","lastTransitionTime":"2025-10-03T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.248285 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.248353 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.248365 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.248382 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.248394 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:31Z","lastTransitionTime":"2025-10-03T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.349822 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.349853 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.349862 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.349874 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.349884 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:31Z","lastTransitionTime":"2025-10-03T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.451982 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.452014 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.452022 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.452035 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.452045 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:31Z","lastTransitionTime":"2025-10-03T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.554178 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.554214 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.554223 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.554236 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.554245 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:31Z","lastTransitionTime":"2025-10-03T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.656530 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.656574 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.656585 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.656601 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.656612 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:31Z","lastTransitionTime":"2025-10-03T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.758992 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.759034 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.759044 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.759059 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.759090 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:31Z","lastTransitionTime":"2025-10-03T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.861654 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.861701 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.861714 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.861729 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.861739 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:31Z","lastTransitionTime":"2025-10-03T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.876562 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.876614 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:31 crc kubenswrapper[4835]: E1003 18:15:31.876917 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:31 crc kubenswrapper[4835]: E1003 18:15:31.877096 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.877253 4835 scope.go:117] "RemoveContainer" containerID="18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.963463 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.963504 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.963516 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.963537 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:31 crc kubenswrapper[4835]: I1003 18:15:31.963547 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:31Z","lastTransitionTime":"2025-10-03T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.065131 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.065189 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.065199 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.065214 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.065223 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:32Z","lastTransitionTime":"2025-10-03T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.167349 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.167400 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.167412 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.167429 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.167441 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:32Z","lastTransitionTime":"2025-10-03T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.239713 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovnkube-controller/2.log" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.242679 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerStarted","Data":"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0"} Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.243117 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.256057 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.264417 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.270136 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.270179 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.270192 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.270207 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.270216 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:32Z","lastTransitionTime":"2025-10-03T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.279438 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.295910 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.320204 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.336485 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.346968 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.360848 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a
4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.371171 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.372397 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.372447 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.372456 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.372470 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.372496 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:32Z","lastTransitionTime":"2025-10-03T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.384117 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358257
71aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.396947 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.409572 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12ccf52445e391368af99975592bd8f1206e9a136c9bc04732839082fcaecde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:26Z\\\",\\\"message\\\":\\\"2025-10-03T18:14:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_97c1c8cc-26ee-4f7c-a050-ff4d91cafa9d\\\\n2025-10-03T18:14:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_97c1c8cc-26ee-4f7c-a050-ff4d91cafa9d to /host/opt/cni/bin/\\\\n2025-10-03T18:14:41Z [verbose] multus-daemon started\\\\n2025-10-03T18:14:41Z [verbose] Readiness Indicator file check\\\\n2025-10-03T18:15:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.430603 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:05Z\\\",\\\"message\\\":\\\"g zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 18:15:05.650934 6487 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 807.71µs\\\\nI1003 18:15:05.650937 6487 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 785.9µs\\\\nI1003 18:15:05.650872 6487 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 18:15:05.650944 6487 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:15:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.440546 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.463010 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a
518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.474349 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.475310 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.475353 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.475364 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.475378 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.475387 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:32Z","lastTransitionTime":"2025-10-03T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.487411 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.497489 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b788715-74fe-4091-ad81-675af7bc1519\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6bb20070b73452498da6c6a6f79e01551a0d203cc7d85f39bc13b9e68482be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.508568 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5385e85b-313e-4b33-bf09-0b8f5e0a994a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e35992e79cb51b47dd78356feedcca634b95f0fbd0aac49017b88d555dab225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b9902dcf4b7c66e119168c3b3eb90f437ca1e723186a43c648d19f4101b851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aa63760f1ef0079ef65f47e141a5938e22007e5f3bf41a1e61491359b1eb0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.577975 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.578012 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.578020 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.578033 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.578041 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:32Z","lastTransitionTime":"2025-10-03T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.680934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.680967 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.680979 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.680993 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.681002 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:32Z","lastTransitionTime":"2025-10-03T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.783468 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.783495 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.783503 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.783514 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.783522 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:32Z","lastTransitionTime":"2025-10-03T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.878142 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:32 crc kubenswrapper[4835]: E1003 18:15:32.878240 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.878389 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:32 crc kubenswrapper[4835]: E1003 18:15:32.878436 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.885167 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.885197 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.885208 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.885222 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.885233 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:32Z","lastTransitionTime":"2025-10-03T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.987908 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.987935 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.987942 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.987989 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:32 crc kubenswrapper[4835]: I1003 18:15:32.987998 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:32Z","lastTransitionTime":"2025-10-03T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.089785 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.089812 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.089820 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.089832 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.089840 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:33Z","lastTransitionTime":"2025-10-03T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.191993 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.192033 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.192043 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.192061 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.192100 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:33Z","lastTransitionTime":"2025-10-03T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.247276 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovnkube-controller/3.log" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.247832 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovnkube-controller/2.log" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.250156 4835 generic.go:334] "Generic (PLEG): container finished" podID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerID="9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0" exitCode=1 Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.250194 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerDied","Data":"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0"} Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.250236 4835 scope.go:117] "RemoveContainer" containerID="18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.250913 4835 scope.go:117] "RemoveContainer" containerID="9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0" Oct 03 18:15:33 crc kubenswrapper[4835]: E1003 18:15:33.251352 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.271814 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a
518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.281443 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.293417 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12ccf52445e391368af99975592bd8f1206e9a136c9bc04732839082fcaecde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:26Z\\\",\\\"message\\\":\\\"2025-10-03T18:14:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_97c1c8cc-26ee-4f7c-a050-ff4d91cafa9d\\\\n2025-10-03T18:14:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_97c1c8cc-26ee-4f7c-a050-ff4d91cafa9d to /host/opt/cni/bin/\\\\n2025-10-03T18:14:41Z [verbose] multus-daemon started\\\\n2025-10-03T18:14:41Z [verbose] Readiness Indicator file check\\\\n2025-10-03T18:15:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.294389 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.294448 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.294459 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.294496 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.294508 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:33Z","lastTransitionTime":"2025-10-03T18:15:33Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.310974 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18c77ce3c0efba87b4cde7abc3ecf37e65d63b4ad3d403691beb2bb5e51a3de4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:05Z\\\",\\\"message\\\":\\\"g zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 18:15:05.650934 6487 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 807.71µs\\\\nI1003 18:15:05.650937 6487 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 785.9µs\\\\nI1003 18:15:05.650872 6487 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 18:15:05.650944 6487 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:15:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:32Z\\\",\\\"message\\\":\\\"al_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console-operator/metrics]} name:Service_openshift-console-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.88:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ebd4748e-0473-49fb-88ad-83dbb221791a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 18:15:32.673732 6864 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console-operator/metrics]} name:Service_openshift-console-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.88:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ebd4748e-0473-49fb-88ad-83dbb221791a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1003 18:15:32.673767 6864 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.320968 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.329549 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b788715-74fe-4091-ad81-675af7bc1519\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6bb20070b73452498da6c6a6f79e01551a0d203cc7d85f39bc13b9e68482be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.340482 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5385e85b-313e-4b33-bf09-0b8f5e0a994a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e35992e79cb51b47dd78356feedcca634b95f0fbd0aac49017b88d555dab225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b9902dcf4b7c66e119168c3b3eb90f437ca1e723186a43c648d19f4101b851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aa63760f1ef0079ef65f47e141a5938e22007e5f3bf41a1e61491359b1eb0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd
789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.352175 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.364735 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.376347 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.389444 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.397860 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.397915 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.397926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.397941 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.397952 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:33Z","lastTransitionTime":"2025-10-03T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.400753 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.412022 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.423041 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.435421 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.445845 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.455283 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.467463 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.476161 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.500612 4835 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.500643 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.500652 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.500664 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.500691 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:33Z","lastTransitionTime":"2025-10-03T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.603136 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.603168 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.603177 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.603191 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.603201 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:33Z","lastTransitionTime":"2025-10-03T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.705090 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.705139 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.705148 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.705162 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.705170 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:33Z","lastTransitionTime":"2025-10-03T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.807789 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.807827 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.807836 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.807849 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.807858 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:33Z","lastTransitionTime":"2025-10-03T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.876751 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.876769 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:33 crc kubenswrapper[4835]: E1003 18:15:33.877042 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:33 crc kubenswrapper[4835]: E1003 18:15:33.876894 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.910131 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.910159 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.910167 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.910179 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:33 crc kubenswrapper[4835]: I1003 18:15:33.910188 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:33Z","lastTransitionTime":"2025-10-03T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.012185 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.012214 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.012222 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.012235 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.012244 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:34Z","lastTransitionTime":"2025-10-03T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.114104 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.114144 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.114157 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.114173 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.114184 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:34Z","lastTransitionTime":"2025-10-03T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.215768 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.215803 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.215813 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.215830 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.215843 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:34Z","lastTransitionTime":"2025-10-03T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.255243 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovnkube-controller/3.log" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.258415 4835 scope.go:117] "RemoveContainer" containerID="9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0" Oct 03 18:15:34 crc kubenswrapper[4835]: E1003 18:15:34.258585 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.267904 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b788715-74fe-4091-ad81-675af7bc1519\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6bb20070b73452498da6c6a6f79e01551a0d203cc7d85f39bc13b9e68482be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60\\\",\\\"exitCode\\\":0,\
\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.277611 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5385e85b-313e-4b33-bf09-0b8f5e0a994a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e35992e79cb51b47dd78356feedcca634b95f0fbd0aac49017b88d555dab225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b9902dcf4b7c66e119168c3b3eb90f437ca1e723186a43c648d19f4101b851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aa63760f1ef0079ef65f4
7e141a5938e22007e5f3bf41a1e61491359b1eb0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.285873 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.297475 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.308700 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.317172 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.317202 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.317210 4835 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.317222 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.317231 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:34Z","lastTransitionTime":"2025-10-03T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.319912 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.329802 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.340113 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.353913 4835 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.363907 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.373596 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb
03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.384766 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.395618 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.405573 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.413943 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.418885 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.418916 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.418926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.418939 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.418947 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:34Z","lastTransitionTime":"2025-10-03T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.430249 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.438625 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.448744 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12ccf52445e391368af99975592bd8f1206e9a136c9bc04732839082fcaecde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:26Z\\\",\\\"message\\\":\\\"2025-10-03T18:14:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_97c1c8cc-26ee-4f7c-a050-ff4d91cafa9d\\\\n2025-10-03T18:14:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_97c1c8cc-26ee-4f7c-a050-ff4d91cafa9d to /host/opt/cni/bin/\\\\n2025-10-03T18:14:41Z [verbose] multus-daemon started\\\\n2025-10-03T18:14:41Z [verbose] Readiness Indicator file check\\\\n2025-10-03T18:15:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.462947 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:32Z\\\",\\\"message\\\":\\\"al_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console-operator/metrics]} name:Service_openshift-console-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.88:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ebd4748e-0473-49fb-88ad-83dbb221791a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 18:15:32.673732 6864 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console-operator/metrics]} name:Service_openshift-console-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.88:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ebd4748e-0473-49fb-88ad-83dbb221791a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1003 18:15:32.673767 6864 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:15:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:34Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.520596 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.520622 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.520634 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.520649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.520659 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:34Z","lastTransitionTime":"2025-10-03T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.622776 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.622810 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.622817 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.622829 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.622838 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:34Z","lastTransitionTime":"2025-10-03T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.725313 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.725636 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.725722 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.725800 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.725886 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:34Z","lastTransitionTime":"2025-10-03T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.828197 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.828236 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.828247 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.828263 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.828273 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:34Z","lastTransitionTime":"2025-10-03T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.877097 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.877105 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:34 crc kubenswrapper[4835]: E1003 18:15:34.877197 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:34 crc kubenswrapper[4835]: E1003 18:15:34.877405 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.930471 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.930514 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.930525 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.930542 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.930553 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:34Z","lastTransitionTime":"2025-10-03T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.993226 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.993266 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.993275 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.993288 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:34 crc kubenswrapper[4835]: I1003 18:15:34.993298 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:34Z","lastTransitionTime":"2025-10-03T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:35 crc kubenswrapper[4835]: E1003 18:15:35.004786 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.007739 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.007795 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.007808 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.007826 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.007839 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:35Z","lastTransitionTime":"2025-10-03T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:35 crc kubenswrapper[4835]: E1003 18:15:35.019172 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.022464 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.022590 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.022651 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.022744 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.022819 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:35Z","lastTransitionTime":"2025-10-03T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:35 crc kubenswrapper[4835]: E1003 18:15:35.035062 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.038191 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.038333 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.038398 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.038475 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.038540 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:35Z","lastTransitionTime":"2025-10-03T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:35 crc kubenswrapper[4835]: E1003 18:15:35.049116 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.052485 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.052520 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.052528 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.052541 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.052551 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:35Z","lastTransitionTime":"2025-10-03T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:35 crc kubenswrapper[4835]: E1003 18:15:35.063300 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:35 crc kubenswrapper[4835]: E1003 18:15:35.063408 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.064755 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.064784 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.064793 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.064807 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.064834 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:35Z","lastTransitionTime":"2025-10-03T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.167727 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.167765 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.167774 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.167787 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.167796 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:35Z","lastTransitionTime":"2025-10-03T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.269953 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.270005 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.270020 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.270036 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.270048 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:35Z","lastTransitionTime":"2025-10-03T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.372580 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.372618 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.372629 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.372644 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.372657 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:35Z","lastTransitionTime":"2025-10-03T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.474363 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.474403 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.474411 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.474426 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.474436 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:35Z","lastTransitionTime":"2025-10-03T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.576373 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.576409 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.576419 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.576434 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.576444 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:35Z","lastTransitionTime":"2025-10-03T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.678374 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.678408 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.678417 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.678429 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.678438 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:35Z","lastTransitionTime":"2025-10-03T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.780670 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.780738 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.780765 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.780779 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.780788 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:35Z","lastTransitionTime":"2025-10-03T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.876582 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.876593 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:35 crc kubenswrapper[4835]: E1003 18:15:35.876717 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:35 crc kubenswrapper[4835]: E1003 18:15:35.876838 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.882950 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.882972 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.882982 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.882993 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.883005 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:35Z","lastTransitionTime":"2025-10-03T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.984761 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.984826 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.984837 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.984851 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:35 crc kubenswrapper[4835]: I1003 18:15:35.984860 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:35Z","lastTransitionTime":"2025-10-03T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.087230 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.087258 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.087267 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.087282 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.087309 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:36Z","lastTransitionTime":"2025-10-03T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.189796 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.189836 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.189848 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.189866 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.189879 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:36Z","lastTransitionTime":"2025-10-03T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.291465 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.291506 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.291518 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.291534 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.291545 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:36Z","lastTransitionTime":"2025-10-03T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.394118 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.394158 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.394167 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.394183 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.394193 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:36Z","lastTransitionTime":"2025-10-03T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.495988 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.496023 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.496031 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.496043 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.496052 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:36Z","lastTransitionTime":"2025-10-03T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.598687 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.598984 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.598995 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.599011 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.599021 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:36Z","lastTransitionTime":"2025-10-03T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.701170 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.701220 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.701231 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.701246 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.701257 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:36Z","lastTransitionTime":"2025-10-03T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.803576 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.803660 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.803672 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.803692 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.803705 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:36Z","lastTransitionTime":"2025-10-03T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.876469 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.876528 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:36 crc kubenswrapper[4835]: E1003 18:15:36.876604 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:36 crc kubenswrapper[4835]: E1003 18:15:36.876687 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.905568 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.905597 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.905606 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.905619 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:36 crc kubenswrapper[4835]: I1003 18:15:36.905632 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:36Z","lastTransitionTime":"2025-10-03T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.008600 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.009175 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.009188 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.009205 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.009229 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:37Z","lastTransitionTime":"2025-10-03T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.111594 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.111650 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.111663 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.111681 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.111691 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:37Z","lastTransitionTime":"2025-10-03T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.214100 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.214135 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.214143 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.214172 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.214181 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:37Z","lastTransitionTime":"2025-10-03T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.316422 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.316458 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.316469 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.316485 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.316496 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:37Z","lastTransitionTime":"2025-10-03T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.418551 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.418600 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.418611 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.418627 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.418636 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:37Z","lastTransitionTime":"2025-10-03T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.520614 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.520651 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.520659 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.520675 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.520684 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:37Z","lastTransitionTime":"2025-10-03T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.622928 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.622983 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.622992 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.623005 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.623016 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:37Z","lastTransitionTime":"2025-10-03T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.724898 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.724940 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.724950 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.724973 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.724984 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:37Z","lastTransitionTime":"2025-10-03T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.827146 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.827185 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.827201 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.827216 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.827227 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:37Z","lastTransitionTime":"2025-10-03T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.876351 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.876435 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:37 crc kubenswrapper[4835]: E1003 18:15:37.876486 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:37 crc kubenswrapper[4835]: E1003 18:15:37.876583 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.929774 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.929814 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.929822 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.929836 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:37 crc kubenswrapper[4835]: I1003 18:15:37.929849 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:37Z","lastTransitionTime":"2025-10-03T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.032336 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.032367 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.032377 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.032390 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.032401 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:38Z","lastTransitionTime":"2025-10-03T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.134543 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.134580 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.134589 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.134604 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.134614 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:38Z","lastTransitionTime":"2025-10-03T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.237653 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.237690 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.237699 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.237714 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.237724 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:38Z","lastTransitionTime":"2025-10-03T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.340372 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.340452 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.340469 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.340499 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.340518 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:38Z","lastTransitionTime":"2025-10-03T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.442355 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.442713 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.442826 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.442962 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.443039 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:38Z","lastTransitionTime":"2025-10-03T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.545927 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.546199 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.546229 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.546250 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.546263 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:38Z","lastTransitionTime":"2025-10-03T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.648818 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.648883 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.648898 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.648918 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.648930 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:38Z","lastTransitionTime":"2025-10-03T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.751228 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.751271 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.751279 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.751293 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.751302 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:38Z","lastTransitionTime":"2025-10-03T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.853439 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.853507 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.853518 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.853535 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.853547 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:38Z","lastTransitionTime":"2025-10-03T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.875782 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:38 crc kubenswrapper[4835]: E1003 18:15:38.875871 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.875793 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:38 crc kubenswrapper[4835]: E1003 18:15:38.876000 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.887310 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:38Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.897295 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b788715-74fe-4091-ad81-675af7bc1519\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6bb20070b73452498da6c6a6f79e01551a0d203cc7d85f39bc13b9e68482be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:38Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.907721 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5385e85b-313e-4b33-bf09-0b8f5e0a994a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e35992e79cb51b47dd78356feedcca634b95f0fbd0aac49017b88d555dab225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b9902dcf4b7c66e119168c3b3eb90f437ca1e723186a43c648d19f4101b851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aa63760f1ef0079ef65f47e141a5938e22007e5f3bf41a1e61491359b1eb0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:38Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.917705 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:38Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.927936 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:38Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.940714 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:38Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.955178 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:38Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.955532 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.955618 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.955626 4835 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.955639 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.955649 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:38Z","lastTransitionTime":"2025-10-03T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.965508 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:38Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.980301 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:38Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:38 crc kubenswrapper[4835]: I1003 18:15:38.990494 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:38Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.008361 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68
7fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.018348 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.030357 4835 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb
03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.044290 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.057236 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12ccf52445e391368af99975592bd8f1206e9a136c9bc04732839082fcaecde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:26Z\\\",\\\"message\\\":\\\"2025-10-03T18:14:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_97c1c8cc-26ee-4f7c-a050-ff4d91cafa9d\\\\n2025-10-03T18:14:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_97c1c8cc-26ee-4f7c-a050-ff4d91cafa9d to /host/opt/cni/bin/\\\\n2025-10-03T18:14:41Z [verbose] multus-daemon started\\\\n2025-10-03T18:14:41Z [verbose] Readiness Indicator file check\\\\n2025-10-03T18:15:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.057744 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.057781 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.057793 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.057814 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.057827 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:39Z","lastTransitionTime":"2025-10-03T18:15:39Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.076058 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:32Z\\\",\\\"message\\\":\\\"al_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console-operator/metrics]} name:Service_openshift-console-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.88:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ebd4748e-0473-49fb-88ad-83dbb221791a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 18:15:32.673732 6864 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console-operator/metrics]} name:Service_openshift-console-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.88:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ebd4748e-0473-49fb-88ad-83dbb221791a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1003 18:15:32.673767 6864 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:15:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.085504 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.107296 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a
518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.117578 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:39Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.160387 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.160432 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.160443 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.160459 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.160471 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:39Z","lastTransitionTime":"2025-10-03T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.262400 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.262442 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.262454 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.262470 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.262483 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:39Z","lastTransitionTime":"2025-10-03T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.365016 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.365047 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.365056 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.365090 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.365100 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:39Z","lastTransitionTime":"2025-10-03T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.466558 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.466592 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.466599 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.466613 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.466621 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:39Z","lastTransitionTime":"2025-10-03T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.569062 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.569114 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.569123 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.569136 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.569146 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:39Z","lastTransitionTime":"2025-10-03T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.671325 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.671356 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.671363 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.671375 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.671395 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:39Z","lastTransitionTime":"2025-10-03T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.773654 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.773683 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.773693 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.773707 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.773716 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:39Z","lastTransitionTime":"2025-10-03T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.876091 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.876108 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:39 crc kubenswrapper[4835]: E1003 18:15:39.876321 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:39 crc kubenswrapper[4835]: E1003 18:15:39.876425 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.876612 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.876640 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.876650 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.876663 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.876673 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:39Z","lastTransitionTime":"2025-10-03T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.978613 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.978670 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.978682 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.978703 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:39 crc kubenswrapper[4835]: I1003 18:15:39.978720 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:39Z","lastTransitionTime":"2025-10-03T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.081137 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.081170 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.081178 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.081190 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.081198 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:40Z","lastTransitionTime":"2025-10-03T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.184100 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.184257 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.184301 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.184322 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.184342 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:40Z","lastTransitionTime":"2025-10-03T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.286962 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.287001 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.287013 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.287027 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.287038 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:40Z","lastTransitionTime":"2025-10-03T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.389281 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.389333 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.389347 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.389366 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.389381 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:40Z","lastTransitionTime":"2025-10-03T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.491031 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.491087 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.491099 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.491117 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.491129 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:40Z","lastTransitionTime":"2025-10-03T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.593498 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.593535 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.593543 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.593559 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.593569 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:40Z","lastTransitionTime":"2025-10-03T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.696105 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.696367 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.696441 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.696531 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.696605 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:40Z","lastTransitionTime":"2025-10-03T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.798635 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.798677 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.798686 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.798702 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.798711 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:40Z","lastTransitionTime":"2025-10-03T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.876899 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.876929 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:40 crc kubenswrapper[4835]: E1003 18:15:40.877175 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:40 crc kubenswrapper[4835]: E1003 18:15:40.877307 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.901350 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.901393 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.901407 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.901424 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:40 crc kubenswrapper[4835]: I1003 18:15:40.901436 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:40Z","lastTransitionTime":"2025-10-03T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.003123 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.003169 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.003182 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.003198 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.003207 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:41Z","lastTransitionTime":"2025-10-03T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.105426 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.105650 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.105711 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.105781 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.105840 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:41Z","lastTransitionTime":"2025-10-03T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.207804 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.207837 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.207847 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.207861 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.207869 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:41Z","lastTransitionTime":"2025-10-03T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.309712 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.309744 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.309753 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.309767 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.309776 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:41Z","lastTransitionTime":"2025-10-03T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.411549 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.411585 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.411593 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.411608 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.411618 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:41Z","lastTransitionTime":"2025-10-03T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.513624 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.513653 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.513664 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.513676 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.513685 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:41Z","lastTransitionTime":"2025-10-03T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.616159 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.616200 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.616211 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.616227 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.616239 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:41Z","lastTransitionTime":"2025-10-03T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.686959 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:15:41 crc kubenswrapper[4835]: E1003 18:15:41.687312 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:45.687275182 +0000 UTC m=+147.403216084 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.718855 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.718896 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.718906 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.718921 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.718932 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:41Z","lastTransitionTime":"2025-10-03T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.788434 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.788468 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.788504 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.788522 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:41 crc kubenswrapper[4835]: E1003 18:15:41.788560 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 18:15:41 crc kubenswrapper[4835]: E1003 18:15:41.788605 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 18:16:45.78859136 +0000 UTC m=+147.504532232 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 18:15:41 crc kubenswrapper[4835]: E1003 18:15:41.788622 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 18:15:41 crc kubenswrapper[4835]: E1003 18:15:41.788633 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 18:15:41 crc kubenswrapper[4835]: E1003 18:15:41.788643 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:15:41 crc kubenswrapper[4835]: E1003 18:15:41.788670 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 18:16:45.788662172 +0000 UTC m=+147.504603044 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:15:41 crc kubenswrapper[4835]: E1003 18:15:41.788709 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 18:15:41 crc kubenswrapper[4835]: E1003 18:15:41.788719 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 18:15:41 crc kubenswrapper[4835]: E1003 18:15:41.788725 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:15:41 crc kubenswrapper[4835]: E1003 18:15:41.788726 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 18:15:41 crc kubenswrapper[4835]: E1003 18:15:41.788753 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 18:16:45.788747475 +0000 UTC m=+147.504688347 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 18:15:41 crc kubenswrapper[4835]: E1003 18:15:41.788831 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 18:16:45.788811137 +0000 UTC m=+147.504752009 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.820726 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.820769 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.820778 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.820794 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.820807 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:41Z","lastTransitionTime":"2025-10-03T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.876587 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.876601 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:41 crc kubenswrapper[4835]: E1003 18:15:41.876703 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:41 crc kubenswrapper[4835]: E1003 18:15:41.876809 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.922419 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.922508 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.922543 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.922557 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:41 crc kubenswrapper[4835]: I1003 18:15:41.922566 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:41Z","lastTransitionTime":"2025-10-03T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.024768 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.024945 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.025037 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.025171 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.025259 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:42Z","lastTransitionTime":"2025-10-03T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.127694 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.127733 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.127741 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.127756 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.127765 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:42Z","lastTransitionTime":"2025-10-03T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.229790 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.230050 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.230060 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.230092 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.230101 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:42Z","lastTransitionTime":"2025-10-03T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.331590 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.331618 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.331625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.331638 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.331647 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:42Z","lastTransitionTime":"2025-10-03T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.433852 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.433883 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.433900 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.433918 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.433935 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:42Z","lastTransitionTime":"2025-10-03T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.535712 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.535756 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.535768 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.535784 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.535795 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:42Z","lastTransitionTime":"2025-10-03T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.637485 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.637517 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.637530 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.637547 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.637558 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:42Z","lastTransitionTime":"2025-10-03T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.739484 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.739521 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.739532 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.739545 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.739554 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:42Z","lastTransitionTime":"2025-10-03T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.841442 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.841474 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.841482 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.841495 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.841503 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:42Z","lastTransitionTime":"2025-10-03T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.876170 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.876223 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:42 crc kubenswrapper[4835]: E1003 18:15:42.876293 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:42 crc kubenswrapper[4835]: E1003 18:15:42.876410 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.943269 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.943303 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.943312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.943327 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:42 crc kubenswrapper[4835]: I1003 18:15:42.943339 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:42Z","lastTransitionTime":"2025-10-03T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.045708 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.045757 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.045765 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.045779 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.045789 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:43Z","lastTransitionTime":"2025-10-03T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.147839 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.148122 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.148210 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.148308 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.148375 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:43Z","lastTransitionTime":"2025-10-03T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.250521 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.250557 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.250565 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.250580 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.250590 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:43Z","lastTransitionTime":"2025-10-03T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.352353 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.352382 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.352389 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.352402 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.352411 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:43Z","lastTransitionTime":"2025-10-03T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.454802 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.454840 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.454848 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.454865 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.454874 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:43Z","lastTransitionTime":"2025-10-03T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.557559 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.557591 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.557599 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.557611 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.557619 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:43Z","lastTransitionTime":"2025-10-03T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.659921 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.659981 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.659993 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.660014 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.660026 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:43Z","lastTransitionTime":"2025-10-03T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.762209 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.762250 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.762262 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.762278 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.762289 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:43Z","lastTransitionTime":"2025-10-03T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.865195 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.865240 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.865250 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.865266 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.865277 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:43Z","lastTransitionTime":"2025-10-03T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.876898 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.877014 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:43 crc kubenswrapper[4835]: E1003 18:15:43.877052 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:43 crc kubenswrapper[4835]: E1003 18:15:43.877295 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.967565 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.967600 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.967609 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.967624 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:43 crc kubenswrapper[4835]: I1003 18:15:43.967633 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:43Z","lastTransitionTime":"2025-10-03T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.070204 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.070247 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.070257 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.070271 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.070282 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:44Z","lastTransitionTime":"2025-10-03T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.172908 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.172966 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.172977 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.172996 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.173010 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:44Z","lastTransitionTime":"2025-10-03T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.275195 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.275243 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.275252 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.275267 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.275276 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:44Z","lastTransitionTime":"2025-10-03T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.378308 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.378875 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.378964 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.379108 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.379200 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:44Z","lastTransitionTime":"2025-10-03T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.481842 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.482293 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.482376 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.482472 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.482644 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:44Z","lastTransitionTime":"2025-10-03T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.585963 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.586301 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.586377 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.586445 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.586522 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:44Z","lastTransitionTime":"2025-10-03T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.688963 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.688997 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.689006 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.689018 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.689027 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:44Z","lastTransitionTime":"2025-10-03T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.791412 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.791452 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.791466 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.791483 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.791495 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:44Z","lastTransitionTime":"2025-10-03T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.876416 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:44 crc kubenswrapper[4835]: E1003 18:15:44.876531 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.876717 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:44 crc kubenswrapper[4835]: E1003 18:15:44.876872 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.893848 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.893880 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.893888 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.893901 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.893910 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:44Z","lastTransitionTime":"2025-10-03T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.996023 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.996083 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.996093 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.996109 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:44 crc kubenswrapper[4835]: I1003 18:15:44.996118 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:44Z","lastTransitionTime":"2025-10-03T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.098086 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.098122 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.098134 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.098148 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.098158 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:45Z","lastTransitionTime":"2025-10-03T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.200371 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.200429 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.200442 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.200460 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.200473 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:45Z","lastTransitionTime":"2025-10-03T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.301764 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.301808 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.301822 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.301840 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.301851 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:45Z","lastTransitionTime":"2025-10-03T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.403931 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.403977 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.403995 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.404014 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.404044 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:45Z","lastTransitionTime":"2025-10-03T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.439137 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.439169 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.439179 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.439190 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.439199 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:45Z","lastTransitionTime":"2025-10-03T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:45 crc kubenswrapper[4835]: E1003 18:15:45.451378 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.455148 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.455176 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.455186 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.455201 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.455212 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:45Z","lastTransitionTime":"2025-10-03T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:45 crc kubenswrapper[4835]: E1003 18:15:45.464904 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.467819 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.467847 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.467855 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.467866 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.467875 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:45Z","lastTransitionTime":"2025-10-03T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:45 crc kubenswrapper[4835]: E1003 18:15:45.477644 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.480618 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.480658 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.480672 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.480695 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.480709 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:45Z","lastTransitionTime":"2025-10-03T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:45 crc kubenswrapper[4835]: E1003 18:15:45.492499 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.495259 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.495310 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.495319 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.495333 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.495342 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:45Z","lastTransitionTime":"2025-10-03T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:45 crc kubenswrapper[4835]: E1003 18:15:45.506677 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"09a7fe10-d48b-4c2b-a983-f4d4d5c8e340\\\",\\\"systemUUID\\\":\\\"5536f758-9b73-4d0a-adbf-baceea025860\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:45Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:45 crc kubenswrapper[4835]: E1003 18:15:45.506792 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.508364 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.508392 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.508400 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.508415 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.508426 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:45Z","lastTransitionTime":"2025-10-03T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.610495 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.610540 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.610553 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.610572 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.610585 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:45Z","lastTransitionTime":"2025-10-03T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.712810 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.712855 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.712866 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.712882 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.712892 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:45Z","lastTransitionTime":"2025-10-03T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.815687 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.815728 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.815740 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.815756 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.815766 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:45Z","lastTransitionTime":"2025-10-03T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.875927 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.875942 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:45 crc kubenswrapper[4835]: E1003 18:15:45.876028 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:45 crc kubenswrapper[4835]: E1003 18:15:45.876155 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.917351 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.917386 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.917403 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.917418 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:45 crc kubenswrapper[4835]: I1003 18:15:45.917432 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:45Z","lastTransitionTime":"2025-10-03T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.019290 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.019333 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.019342 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.019358 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.019366 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:46Z","lastTransitionTime":"2025-10-03T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.121979 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.122224 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.122295 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.122371 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.122439 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:46Z","lastTransitionTime":"2025-10-03T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.224747 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.225002 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.225143 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.225251 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.225366 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:46Z","lastTransitionTime":"2025-10-03T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.328106 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.328140 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.328148 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.328162 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.328171 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:46Z","lastTransitionTime":"2025-10-03T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.430352 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.430406 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.430422 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.430445 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.430460 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:46Z","lastTransitionTime":"2025-10-03T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.532765 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.532805 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.532819 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.532835 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.532846 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:46Z","lastTransitionTime":"2025-10-03T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.634572 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.634887 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.634978 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.635127 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.635232 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:46Z","lastTransitionTime":"2025-10-03T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.737709 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.737950 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.738029 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.738127 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.738200 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:46Z","lastTransitionTime":"2025-10-03T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.840702 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.840755 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.840770 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.840791 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.840806 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:46Z","lastTransitionTime":"2025-10-03T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.875876 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.875928 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:46 crc kubenswrapper[4835]: E1003 18:15:46.876623 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:46 crc kubenswrapper[4835]: E1003 18:15:46.876768 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.943046 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.943095 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.943107 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.943119 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:46 crc kubenswrapper[4835]: I1003 18:15:46.943127 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:46Z","lastTransitionTime":"2025-10-03T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.045961 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.046015 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.046026 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.046044 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.046055 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:47Z","lastTransitionTime":"2025-10-03T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.148777 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.148826 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.148844 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.148865 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.148882 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:47Z","lastTransitionTime":"2025-10-03T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.251960 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.251997 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.252008 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.252023 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.252035 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:47Z","lastTransitionTime":"2025-10-03T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.354047 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.354102 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.354143 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.354159 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.354170 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:47Z","lastTransitionTime":"2025-10-03T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.456279 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.456342 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.456352 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.456366 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.456378 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:47Z","lastTransitionTime":"2025-10-03T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.558223 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.558255 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.558265 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.558277 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.558286 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:47Z","lastTransitionTime":"2025-10-03T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.661080 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.661121 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.661146 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.661160 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.661169 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:47Z","lastTransitionTime":"2025-10-03T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.763305 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.763361 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.763372 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.763392 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.763405 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:47Z","lastTransitionTime":"2025-10-03T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.865145 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.865182 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.865191 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.865205 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.865216 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:47Z","lastTransitionTime":"2025-10-03T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.876486 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.876526 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:47 crc kubenswrapper[4835]: E1003 18:15:47.876643 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:47 crc kubenswrapper[4835]: E1003 18:15:47.876714 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.877838 4835 scope.go:117] "RemoveContainer" containerID="9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0" Oct 03 18:15:47 crc kubenswrapper[4835]: E1003 18:15:47.878047 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.967819 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.967869 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.967880 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.967895 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:47 crc kubenswrapper[4835]: I1003 18:15:47.967905 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:47Z","lastTransitionTime":"2025-10-03T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.070377 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.070420 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.070428 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.070446 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.070456 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:48Z","lastTransitionTime":"2025-10-03T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.172596 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.172633 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.172642 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.172656 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.172666 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:48Z","lastTransitionTime":"2025-10-03T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.275025 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.275084 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.275093 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.275108 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.275116 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:48Z","lastTransitionTime":"2025-10-03T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.376936 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.376984 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.376997 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.377013 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.377025 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:48Z","lastTransitionTime":"2025-10-03T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.479256 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.479300 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.479312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.479329 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.479343 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:48Z","lastTransitionTime":"2025-10-03T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.581391 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.581448 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.581462 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.581480 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.581490 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:48Z","lastTransitionTime":"2025-10-03T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.684080 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.684121 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.684130 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.684144 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.684154 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:48Z","lastTransitionTime":"2025-10-03T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.787358 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.787427 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.787450 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.787482 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.787509 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:48Z","lastTransitionTime":"2025-10-03T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.876475 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:48 crc kubenswrapper[4835]: E1003 18:15:48.876602 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.876672 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:48 crc kubenswrapper[4835]: E1003 18:15:48.877122 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.889563 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.889602 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.889611 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.889623 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.889632 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:48Z","lastTransitionTime":"2025-10-03T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.891177 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b788715-74fe-4091-ad81-675af7bc1519\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b6bb20070b73452498da6c6a6f79e01551a0d203cc7d85f39bc13b9e68482be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9a57eeae506e1ab2be03594b387919f0733cb6b9ffb11c44f22da78ba7f1c60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.903183 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5385e85b-313e-4b33-bf09-0b8f5e0a994a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e35992e79cb51b47dd78356feedcca634b95f0fbd0aac49017b88d555dab225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b9902dcf4b7c66e119168c3b3eb90f437ca1e723186a43c648d19f4101b851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4aa63760f1ef0079ef65f47e141a5938e22007e5f3bf41a1e61491359b1eb0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3198288f7f387536c676ce1c251db2f7d9e5ea935c241951aadb4bcf6ca32bcf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.915862 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2705556-f411-476d-9d8a-78543bae8dc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm88q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vlmkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.928039 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3afdb36-83d3-4860-86d4-203d1ce896dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f628d064bfafb7e95d0246022aa4d232fd0ad68be089356faed8e4e9e2cfc25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b0ec0f0aa8da438e80aa080ec5d73a525ab72d746b7ef994a8bac33199abc3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e0c928015275ed5f1ade4df2061766ebba0f42977b9b4f8613feeb6a30d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c2a82c69cbde0f3d90aaf41cd1e7604c808263c6ff3dba1e7f1ca00c327b125\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6add821116f1d2bd1e38f9c741d2b27cf7ab2df1ae5324050342d9f959346d3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW1003 18:14:37.154002 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 18:14:37.154229 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 18:14:37.156719 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4283832143/tls.crt::/tmp/serving-cert-4283832143/tls.key\\\\\\\"\\\\nI1003 18:14:37.628808 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 18:14:37.632601 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 18:14:37.632621 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 18:14:37.632638 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 18:14:37.632643 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 18:14:37.643425 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 18:14:37.643463 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643484 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 18:14:37.643490 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 18:14:37.643494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 18:14:37.643498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 18:14:37.643502 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1003 18:14:37.643696 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1003 18:14:37.647148 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba216544ddf7f103ea197084fe031602dae979b0bb0d74ee130a49faf2189b0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e039cbc826225ea893e72c573bbeaed7225993ec7dbdcb1238acae4b324f063\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.944875 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e258ae25375830603a8c1a754d63eb235c2a61d79dfbc079fda264c78989629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42141191608c2cb70104f36861149b068f9fb62cb97ca2104aaa4337179d4fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.959700 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.971249 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1782f6fb-6c25-419c-914a-9f88c72af1bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c916e1dde33662b3b16ca3b6d00439ff316686d098322f333a4d85b9f84b69b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43341b750cd629bcf639cd7c477aff515a08324c9696487c6bdd76da96a4724b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnmm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n7z6j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:48Z is after 2025-08-24T17:21:41Z" Oct 03 
18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.983493 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8529a508f0aaeec34506889457a4e4cfe7efe5f317d5a2674fbd96c66025e50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.991408 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.991444 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.991453 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.991473 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:48 crc kubenswrapper[4835]: I1003 18:15:48.991483 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:48Z","lastTransitionTime":"2025-10-03T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.000398 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c53bc2-068f-4b54-9a51-1eee44a03e59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07d25ccdafd7a995a59720725af6bf576165a98adf7705ba06dac9985b7f6bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61e6439891ac13ed9cfca8d63bdf4460275a6b8ba901e1ccaa580bcbd0e4c284\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7796fcff0ad264b8398e5d4e3e21d92f228cf8e36f46f55bfa16faf29a0aba73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ec2949b413de2a4d6408f6ee8b2051edb5df46e363e1a910d7df9ceb0595b85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d026cc52bd3da72c9c756e1f46ac441770e6c01ef3a4b2104775c224297c59e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d1f8114f554a4bb05b810ce87c18b5538630509a1089a2f0f1db123ed3cb333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d44db27b21cbed32356ec91c0ca56da878a6e4461ed3bd837c13f0f84a7e863\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjhfs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dzgvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:48Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.010642 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-w4fql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f335cbaf72736d3d417616fedb78c30d67ea62dcd0bb2efc5c5eee90956c397f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rgs5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w4fql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:49Z is after 2025-08-24T17:21:41Z" Oct 03 
18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.025211 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46515a87-0ee0-4c95-9591-efed938cd552\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b48e37eac6854a49956bc5f9c6ccdf3a8f116fecc31a54aa7b6181ecbd77c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e818cb68cc3f52ee8331eaeec7e9bae7098f47c87f8513134785a6049bcb40a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ac42f3623fe4349c718bfbe86f70c65edf8893e414a1a2f1130453dc3c5da11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d0325ed36f8fe38efb469feeb03835dbf4750ecf04e77ad900ad4f11564a43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.036437 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55097560f6dbcb0706d49f18a072fe9f11cdc22539502bb512bafdbcaf0b1ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.046386 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.056521 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.067364 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zsch7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e63fda18-1d89-4268-aa9a-9e04f6c1539e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd1513e7033dc818412896741fdfa5c1a6fa6cea05bf2e16cd64ef31930771c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t2844\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zsch7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.084503 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ed6886-6bc2-4075-9b95-49efa498dbc4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd6c9b489af699f4658e1d4002239b7d20c7f3237a964c772dee407341987bf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a39a14838fba0151881fc1934ef07a1a182000ecdc7733f1a6528b4ae7d5ce82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e192ce109b291d9bf9a495c319f487c7b20c8bf023bd549fb528407e5680e02e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5372367a83bc43128267e9e0f12206a89f4537a518b4de99fd9cd98c1f9434c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b46d7414d8d6b0d7b30f6c56d5602e40d661a90a84ec72933e7cb7b45450d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2e0078d9a94af993fa1d9a380be3714df7f5d15aab591ab875de73f5c75ca4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3adce5904e38f273855f6e71b687bda72c83101b003eb6bbe5be5d46502e8046\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c5349ad170e606896e39fd05b9d116d7b87dfabc6501b1d9ea6f9f58b5cb50b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.093802 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.093829 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.093856 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.093870 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.093878 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:49Z","lastTransitionTime":"2025-10-03T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.094401 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-4x78q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2af8bc4b-5145-400f-a847-ef393bd84601\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06e5c725308a92131fc85f0ba13bccb541cd4577cb37a9746966ccc51079c1f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ht675\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-4x78q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.105368 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8p9cd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12ccf52445e391368af99975592bd8f1206e9a136c9bc04732839082fcaecde1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:26Z\\\",\\\"message\\\":\\\"2025-10-03T18:14:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_97c1c8cc-26ee-4f7c-a050-ff4d91cafa9d\\\\n2025-10-03T18:14:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_97c1c8cc-26ee-4f7c-a050-ff4d91cafa9d to /host/opt/cni/bin/\\\\n2025-10-03T18:14:41Z [verbose] multus-daemon started\\\\n2025-10-03T18:14:41Z [verbose] Readiness Indicator file check\\\\n2025-10-03T18:15:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fpncv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8p9cd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.126639 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T18:15:32Z\\\",\\\"message\\\":\\\"al_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console-operator/metrics]} name:Service_openshift-console-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.88:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ebd4748e-0473-49fb-88ad-83dbb221791a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 18:15:32.673732 6864 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console-operator/metrics]} name:Service_openshift-console-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.88:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ebd4748e-0473-49fb-88ad-83dbb221791a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1003 18:15:32.673767 6864 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T18:15:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T18:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T18:14:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T18:14:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9z72\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T18:14:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p2w8j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T18:15:49Z is after 2025-08-24T17:21:41Z" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.196701 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.196736 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.196748 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.196764 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.196775 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:49Z","lastTransitionTime":"2025-10-03T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.298638 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.298701 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.298710 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.298723 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.298732 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:49Z","lastTransitionTime":"2025-10-03T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.401543 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.401595 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.401608 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.401626 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.401639 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:49Z","lastTransitionTime":"2025-10-03T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.503715 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.503750 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.503759 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.503774 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.503782 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:49Z","lastTransitionTime":"2025-10-03T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.605840 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.605873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.605881 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.605893 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.605901 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:49Z","lastTransitionTime":"2025-10-03T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.707925 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.707963 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.707971 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.707985 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.707996 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:49Z","lastTransitionTime":"2025-10-03T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.809868 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.809910 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.809920 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.809934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.809944 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:49Z","lastTransitionTime":"2025-10-03T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.876023 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.876093 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:49 crc kubenswrapper[4835]: E1003 18:15:49.876192 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:49 crc kubenswrapper[4835]: E1003 18:15:49.876299 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.911749 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.911779 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.911787 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.911802 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:49 crc kubenswrapper[4835]: I1003 18:15:49.911813 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:49Z","lastTransitionTime":"2025-10-03T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.013783 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.013817 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.013826 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.013838 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.013849 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:50Z","lastTransitionTime":"2025-10-03T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.115971 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.116006 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.116014 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.116027 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.116035 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:50Z","lastTransitionTime":"2025-10-03T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.217890 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.217928 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.217940 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.217955 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.217966 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:50Z","lastTransitionTime":"2025-10-03T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.320272 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.320312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.320320 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.320333 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.320343 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:50Z","lastTransitionTime":"2025-10-03T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.422259 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.422300 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.422312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.422325 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.422337 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:50Z","lastTransitionTime":"2025-10-03T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.524124 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.524163 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.524172 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.524186 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.524195 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:50Z","lastTransitionTime":"2025-10-03T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.626437 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.626469 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.626477 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.626493 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.626510 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:50Z","lastTransitionTime":"2025-10-03T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.728858 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.728894 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.728903 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.728916 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.728924 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:50Z","lastTransitionTime":"2025-10-03T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.830889 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.830934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.830945 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.830960 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.830970 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:50Z","lastTransitionTime":"2025-10-03T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.876687 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.876763 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:50 crc kubenswrapper[4835]: E1003 18:15:50.876793 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:50 crc kubenswrapper[4835]: E1003 18:15:50.876897 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.932624 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.932660 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.932671 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.932685 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:50 crc kubenswrapper[4835]: I1003 18:15:50.932697 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:50Z","lastTransitionTime":"2025-10-03T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.034880 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.034913 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.034926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.034964 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.034976 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:51Z","lastTransitionTime":"2025-10-03T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.137896 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.137955 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.137968 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.137990 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.138004 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:51Z","lastTransitionTime":"2025-10-03T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.240543 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.240866 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.240991 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.241109 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.241246 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:51Z","lastTransitionTime":"2025-10-03T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.344615 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.344663 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.344690 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.344707 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.344719 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:51Z","lastTransitionTime":"2025-10-03T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.447941 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.447996 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.448009 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.448028 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.448042 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:51Z","lastTransitionTime":"2025-10-03T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.551096 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.551149 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.551182 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.551198 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.551207 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:51Z","lastTransitionTime":"2025-10-03T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.653926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.653986 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.653995 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.654009 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.654019 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:51Z","lastTransitionTime":"2025-10-03T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.756547 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.756599 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.756612 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.756630 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.756642 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:51Z","lastTransitionTime":"2025-10-03T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.860933 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.860974 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.860984 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.860999 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.861019 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:51Z","lastTransitionTime":"2025-10-03T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.876379 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.876428 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:51 crc kubenswrapper[4835]: E1003 18:15:51.876540 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:51 crc kubenswrapper[4835]: E1003 18:15:51.876594 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.963603 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.963635 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.963644 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.963657 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:51 crc kubenswrapper[4835]: I1003 18:15:51.963666 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:51Z","lastTransitionTime":"2025-10-03T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.066032 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.066089 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.066102 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.066119 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.066130 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:52Z","lastTransitionTime":"2025-10-03T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.167564 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.167604 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.167618 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.167634 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.167644 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:52Z","lastTransitionTime":"2025-10-03T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.270155 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.270184 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.270191 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.270204 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.270213 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:52Z","lastTransitionTime":"2025-10-03T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.371934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.371967 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.371978 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.371991 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.372000 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:52Z","lastTransitionTime":"2025-10-03T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.474553 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.474594 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.474602 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.474616 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.474626 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:52Z","lastTransitionTime":"2025-10-03T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.576864 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.576900 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.576909 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.576922 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.576930 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:52Z","lastTransitionTime":"2025-10-03T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.678927 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.678971 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.678981 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.678997 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.679007 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:52Z","lastTransitionTime":"2025-10-03T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.781763 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.781834 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.781856 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.781893 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.781913 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:52Z","lastTransitionTime":"2025-10-03T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.876989 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.877103 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:52 crc kubenswrapper[4835]: E1003 18:15:52.877391 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:52 crc kubenswrapper[4835]: E1003 18:15:52.877500 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.883493 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.883647 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.883746 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.883834 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.883928 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:52Z","lastTransitionTime":"2025-10-03T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.986817 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.986905 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.986935 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.986974 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:52 crc kubenswrapper[4835]: I1003 18:15:52.987034 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:52Z","lastTransitionTime":"2025-10-03T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.090000 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.090134 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.090158 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.090189 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.090236 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:53Z","lastTransitionTime":"2025-10-03T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.193396 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.193494 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.193516 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.193558 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.193585 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:53Z","lastTransitionTime":"2025-10-03T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.296861 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.296949 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.296982 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.297017 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.297046 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:53Z","lastTransitionTime":"2025-10-03T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.400331 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.400395 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.400411 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.400438 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.400457 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:53Z","lastTransitionTime":"2025-10-03T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.506251 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.506310 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.506328 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.506350 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.506366 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:53Z","lastTransitionTime":"2025-10-03T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.607932 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.607962 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.607971 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.607983 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.607991 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:53Z","lastTransitionTime":"2025-10-03T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.710644 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.710681 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.710693 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.710709 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.710721 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:53Z","lastTransitionTime":"2025-10-03T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.812931 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.812978 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.812991 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.813007 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.813019 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:53Z","lastTransitionTime":"2025-10-03T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.876557 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.876568 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:53 crc kubenswrapper[4835]: E1003 18:15:53.876714 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:53 crc kubenswrapper[4835]: E1003 18:15:53.876824 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.915741 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.915786 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.915795 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.915809 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:53 crc kubenswrapper[4835]: I1003 18:15:53.915818 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:53Z","lastTransitionTime":"2025-10-03T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.018303 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.018354 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.018366 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.018384 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.018398 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:54Z","lastTransitionTime":"2025-10-03T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.121368 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.121412 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.121425 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.121439 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.121450 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:54Z","lastTransitionTime":"2025-10-03T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.223997 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.224082 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.224099 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.224119 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.224131 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:54Z","lastTransitionTime":"2025-10-03T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.326272 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.326309 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.326319 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.326335 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.326346 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:54Z","lastTransitionTime":"2025-10-03T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.428536 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.428629 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.428648 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.428681 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.428700 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:54Z","lastTransitionTime":"2025-10-03T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.532328 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.532412 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.532431 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.532461 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.532481 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:54Z","lastTransitionTime":"2025-10-03T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.635440 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.635494 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.635505 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.635521 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.635530 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:54Z","lastTransitionTime":"2025-10-03T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.738087 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.738151 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.738163 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.738181 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.738191 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:54Z","lastTransitionTime":"2025-10-03T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.841403 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.841485 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.841503 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.841526 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.841543 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:54Z","lastTransitionTime":"2025-10-03T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.875912 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:54 crc kubenswrapper[4835]: E1003 18:15:54.876113 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.876318 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:54 crc kubenswrapper[4835]: E1003 18:15:54.876439 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.943999 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.944271 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.944348 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.944434 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:54 crc kubenswrapper[4835]: I1003 18:15:54.944726 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:54Z","lastTransitionTime":"2025-10-03T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.047013 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.047052 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.047061 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.047090 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.047100 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:55Z","lastTransitionTime":"2025-10-03T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.149901 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.150255 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.150321 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.150413 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.150484 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:55Z","lastTransitionTime":"2025-10-03T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.252286 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.252344 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.252353 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.252365 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.252373 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:55Z","lastTransitionTime":"2025-10-03T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.355361 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.355407 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.355417 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.355431 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.355443 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:55Z","lastTransitionTime":"2025-10-03T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.458087 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.458126 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.458135 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.458151 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.458194 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:55Z","lastTransitionTime":"2025-10-03T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.560977 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.561009 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.561020 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.561034 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.561043 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:55Z","lastTransitionTime":"2025-10-03T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.663147 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.663190 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.663198 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.663215 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.663259 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:55Z","lastTransitionTime":"2025-10-03T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.765190 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.765235 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.765243 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.765263 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.765274 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:55Z","lastTransitionTime":"2025-10-03T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.766741 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.766801 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.766817 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.766839 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.766858 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T18:15:55Z","lastTransitionTime":"2025-10-03T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.806825 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd"] Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.807210 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.808861 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.808861 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.809062 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.809423 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.820138 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=30.820109522 podStartE2EDuration="30.820109522s" podCreationTimestamp="2025-10-03 18:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:15:55.819697331 +0000 UTC m=+97.535638213" watchObservedRunningTime="2025-10-03 18:15:55.820109522 +0000 UTC m=+97.536050424" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.824899 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5df9c850-0ee9-4412-acad-653eb079faf7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sxgvd\" (UID: \"5df9c850-0ee9-4412-acad-653eb079faf7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.824959 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5df9c850-0ee9-4412-acad-653eb079faf7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sxgvd\" (UID: \"5df9c850-0ee9-4412-acad-653eb079faf7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.824985 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5df9c850-0ee9-4412-acad-653eb079faf7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sxgvd\" (UID: \"5df9c850-0ee9-4412-acad-653eb079faf7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.825033 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5df9c850-0ee9-4412-acad-653eb079faf7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sxgvd\" (UID: \"5df9c850-0ee9-4412-acad-653eb079faf7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.825092 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5df9c850-0ee9-4412-acad-653eb079faf7-kube-api-access\") pod 
\"cluster-version-operator-5c965bbfc6-sxgvd\" (UID: \"5df9c850-0ee9-4412-acad-653eb079faf7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.832880 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=48.832862275 podStartE2EDuration="48.832862275s" podCreationTimestamp="2025-10-03 18:15:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:15:55.831593911 +0000 UTC m=+97.547534783" watchObservedRunningTime="2025-10-03 18:15:55.832862275 +0000 UTC m=+97.548803147" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.859116 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.859097842 podStartE2EDuration="1m18.859097842s" podCreationTimestamp="2025-10-03 18:14:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:15:55.858791434 +0000 UTC m=+97.574732306" watchObservedRunningTime="2025-10-03 18:15:55.859097842 +0000 UTC m=+97.575038714" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.875986 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.876275 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:55 crc kubenswrapper[4835]: E1003 18:15:55.876425 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:55 crc kubenswrapper[4835]: E1003 18:15:55.876616 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.912369 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n7z6j" podStartSLOduration=76.912352106 podStartE2EDuration="1m16.912352106s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:15:55.9009987 +0000 UTC m=+97.616939572" watchObservedRunningTime="2025-10-03 18:15:55.912352106 +0000 UTC m=+97.628292978" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.925630 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5df9c850-0ee9-4412-acad-653eb079faf7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sxgvd\" (UID: \"5df9c850-0ee9-4412-acad-653eb079faf7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.925686 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5df9c850-0ee9-4412-acad-653eb079faf7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sxgvd\" (UID: \"5df9c850-0ee9-4412-acad-653eb079faf7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.925725 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5df9c850-0ee9-4412-acad-653eb079faf7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sxgvd\" (UID: \"5df9c850-0ee9-4412-acad-653eb079faf7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.925749 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5df9c850-0ee9-4412-acad-653eb079faf7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sxgvd\" (UID: \"5df9c850-0ee9-4412-acad-653eb079faf7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.925763 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5df9c850-0ee9-4412-acad-653eb079faf7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sxgvd\" (UID: \"5df9c850-0ee9-4412-acad-653eb079faf7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.925852 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5df9c850-0ee9-4412-acad-653eb079faf7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sxgvd\" (UID: \"5df9c850-0ee9-4412-acad-653eb079faf7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.925893 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5df9c850-0ee9-4412-acad-653eb079faf7-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-sxgvd\" (UID: \"5df9c850-0ee9-4412-acad-653eb079faf7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.926593 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5df9c850-0ee9-4412-acad-653eb079faf7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sxgvd\" (UID: \"5df9c850-0ee9-4412-acad-653eb079faf7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.930769 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=77.930753381 podStartE2EDuration="1m17.930753381s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:15:55.930528436 +0000 UTC m=+97.646469328" watchObservedRunningTime="2025-10-03 18:15:55.930753381 +0000 UTC m=+97.646694253" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.931551 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5df9c850-0ee9-4412-acad-653eb079faf7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sxgvd\" (UID: \"5df9c850-0ee9-4412-acad-653eb079faf7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.942718 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5df9c850-0ee9-4412-acad-653eb079faf7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sxgvd\" (UID: \"5df9c850-0ee9-4412-acad-653eb079faf7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:55 crc kubenswrapper[4835]: I1003 18:15:55.992051 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dzgvb" podStartSLOduration=77.992036242 podStartE2EDuration="1m17.992036242s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:15:55.991265041 +0000 UTC m=+97.707205933" watchObservedRunningTime="2025-10-03 18:15:55.992036242 +0000 UTC m=+97.707977114" Oct 03 18:15:56 crc kubenswrapper[4835]: I1003 18:15:56.000960 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podStartSLOduration=78.000944842 podStartE2EDuration="1m18.000944842s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:15:56.000814258 +0000 UTC m=+97.716755140" watchObservedRunningTime="2025-10-03 18:15:56.000944842 +0000 UTC m=+97.716885704" Oct 03 18:15:56 crc kubenswrapper[4835]: I1003 18:15:56.024345 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=76.024327182 podStartE2EDuration="1m16.024327182s" podCreationTimestamp="2025-10-03 18:14:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:15:56.023460148 +0000 UTC m=+97.739401020" watchObservedRunningTime="2025-10-03 18:15:56.024327182 +0000 UTC m=+97.740268054" Oct 03 18:15:56 crc kubenswrapper[4835]: I1003 18:15:56.033116 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4x78q" podStartSLOduration=78.033093108 podStartE2EDuration="1m18.033093108s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:15:56.03280228 +0000 UTC m=+97.748743172" watchObservedRunningTime="2025-10-03 18:15:56.033093108 +0000 UTC m=+97.749033980" Oct 03 18:15:56 crc kubenswrapper[4835]: I1003 18:15:56.045667 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8p9cd" podStartSLOduration=78.045650396 podStartE2EDuration="1m18.045650396s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:15:56.045610925 +0000 UTC m=+97.761551807" watchObservedRunningTime="2025-10-03 18:15:56.045650396 +0000 UTC m=+97.761591268" Oct 03 18:15:56 crc kubenswrapper[4835]: I1003 18:15:56.076109 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zsch7" podStartSLOduration=78.076091266 podStartE2EDuration="1m18.076091266s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:15:56.075349675 +0000 UTC m=+97.791290577" watchObservedRunningTime="2025-10-03 18:15:56.076091266 +0000 UTC m=+97.792032138" Oct 03 18:15:56 crc kubenswrapper[4835]: I1003 18:15:56.119662 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" Oct 03 18:15:56 crc kubenswrapper[4835]: W1003 18:15:56.131665 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5df9c850_0ee9_4412_acad_653eb079faf7.slice/crio-7ccbc390104f4222dbf6cdcfcfedf23d6250c3965c6eea3ec826385f9287ddfb WatchSource:0}: Error finding container 7ccbc390104f4222dbf6cdcfcfedf23d6250c3965c6eea3ec826385f9287ddfb: Status 404 returned error can't find the container with id 7ccbc390104f4222dbf6cdcfcfedf23d6250c3965c6eea3ec826385f9287ddfb Oct 03 18:15:56 crc kubenswrapper[4835]: I1003 18:15:56.316051 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" event={"ID":"5df9c850-0ee9-4412-acad-653eb079faf7","Type":"ContainerStarted","Data":"53c4daafc87964e35c4845a22e17623e4385a3f32bf470bb278d1b7d580eb8a6"} Oct 03 18:15:56 crc kubenswrapper[4835]: I1003 18:15:56.316363 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" event={"ID":"5df9c850-0ee9-4412-acad-653eb079faf7","Type":"ContainerStarted","Data":"7ccbc390104f4222dbf6cdcfcfedf23d6250c3965c6eea3ec826385f9287ddfb"} Oct 03 18:15:56 crc kubenswrapper[4835]: I1003 18:15:56.875972 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:56 crc kubenswrapper[4835]: E1003 18:15:56.876118 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:56 crc kubenswrapper[4835]: I1003 18:15:56.876213 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:56 crc kubenswrapper[4835]: E1003 18:15:56.876345 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:56 crc kubenswrapper[4835]: I1003 18:15:56.934485 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs\") pod \"network-metrics-daemon-vlmkl\" (UID: \"e2705556-f411-476d-9d8a-78543bae8dc7\") " pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:56 crc kubenswrapper[4835]: E1003 18:15:56.934618 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 18:15:56 crc kubenswrapper[4835]: E1003 18:15:56.934671 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs podName:e2705556-f411-476d-9d8a-78543bae8dc7 nodeName:}" failed. No retries permitted until 2025-10-03 18:17:00.934657378 +0000 UTC m=+162.650598250 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs") pod "network-metrics-daemon-vlmkl" (UID: "e2705556-f411-476d-9d8a-78543bae8dc7") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 18:15:57 crc kubenswrapper[4835]: I1003 18:15:57.876534 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:57 crc kubenswrapper[4835]: E1003 18:15:57.876650 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:57 crc kubenswrapper[4835]: I1003 18:15:57.876534 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:57 crc kubenswrapper[4835]: E1003 18:15:57.876812 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:15:58 crc kubenswrapper[4835]: I1003 18:15:58.875862 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:15:58 crc kubenswrapper[4835]: I1003 18:15:58.875912 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:15:58 crc kubenswrapper[4835]: E1003 18:15:58.877029 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:15:58 crc kubenswrapper[4835]: E1003 18:15:58.877256 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:15:59 crc kubenswrapper[4835]: I1003 18:15:59.876806 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:15:59 crc kubenswrapper[4835]: E1003 18:15:59.876940 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:15:59 crc kubenswrapper[4835]: I1003 18:15:59.877186 4835 scope.go:117] "RemoveContainer" containerID="9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0" Oct 03 18:15:59 crc kubenswrapper[4835]: I1003 18:15:59.877293 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:15:59 crc kubenswrapper[4835]: E1003 18:15:59.877401 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" Oct 03 18:15:59 crc kubenswrapper[4835]: E1003 18:15:59.877499 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:16:00 crc kubenswrapper[4835]: I1003 18:16:00.876431 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:16:00 crc kubenswrapper[4835]: E1003 18:16:00.876542 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:16:00 crc kubenswrapper[4835]: I1003 18:16:00.876431 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:00 crc kubenswrapper[4835]: E1003 18:16:00.876614 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:16:01 crc kubenswrapper[4835]: I1003 18:16:01.876348 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:01 crc kubenswrapper[4835]: I1003 18:16:01.876372 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:01 crc kubenswrapper[4835]: E1003 18:16:01.876488 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:16:01 crc kubenswrapper[4835]: E1003 18:16:01.876602 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:16:02 crc kubenswrapper[4835]: I1003 18:16:02.876547 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:02 crc kubenswrapper[4835]: I1003 18:16:02.877285 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:16:02 crc kubenswrapper[4835]: E1003 18:16:02.877499 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:16:02 crc kubenswrapper[4835]: E1003 18:16:02.877644 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:16:03 crc kubenswrapper[4835]: I1003 18:16:03.876603 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:03 crc kubenswrapper[4835]: E1003 18:16:03.876714 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:16:03 crc kubenswrapper[4835]: I1003 18:16:03.876606 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:03 crc kubenswrapper[4835]: E1003 18:16:03.876887 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:16:04 crc kubenswrapper[4835]: I1003 18:16:04.876849 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:04 crc kubenswrapper[4835]: I1003 18:16:04.876995 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:16:04 crc kubenswrapper[4835]: E1003 18:16:04.877047 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:16:04 crc kubenswrapper[4835]: E1003 18:16:04.876954 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:16:05 crc kubenswrapper[4835]: I1003 18:16:05.876312 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:05 crc kubenswrapper[4835]: E1003 18:16:05.876695 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:16:05 crc kubenswrapper[4835]: I1003 18:16:05.876360 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:05 crc kubenswrapper[4835]: E1003 18:16:05.876898 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:16:06 crc kubenswrapper[4835]: I1003 18:16:06.876523 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:06 crc kubenswrapper[4835]: E1003 18:16:06.876661 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:16:06 crc kubenswrapper[4835]: I1003 18:16:06.876528 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:16:06 crc kubenswrapper[4835]: E1003 18:16:06.876897 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:16:07 crc kubenswrapper[4835]: I1003 18:16:07.876197 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:07 crc kubenswrapper[4835]: I1003 18:16:07.876401 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:07 crc kubenswrapper[4835]: E1003 18:16:07.876490 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:16:07 crc kubenswrapper[4835]: E1003 18:16:07.876638 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:16:08 crc kubenswrapper[4835]: I1003 18:16:08.877060 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:08 crc kubenswrapper[4835]: E1003 18:16:08.877607 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:16:08 crc kubenswrapper[4835]: I1003 18:16:08.877274 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:16:08 crc kubenswrapper[4835]: E1003 18:16:08.877789 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:16:09 crc kubenswrapper[4835]: I1003 18:16:09.876236 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:09 crc kubenswrapper[4835]: E1003 18:16:09.876469 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:16:09 crc kubenswrapper[4835]: I1003 18:16:09.876837 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:09 crc kubenswrapper[4835]: E1003 18:16:09.876953 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:16:10 crc kubenswrapper[4835]: I1003 18:16:10.877355 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:10 crc kubenswrapper[4835]: I1003 18:16:10.878136 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:16:10 crc kubenswrapper[4835]: E1003 18:16:10.878298 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:16:10 crc kubenswrapper[4835]: E1003 18:16:10.878465 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:16:11 crc kubenswrapper[4835]: I1003 18:16:11.876094 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:11 crc kubenswrapper[4835]: I1003 18:16:11.876123 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:11 crc kubenswrapper[4835]: E1003 18:16:11.876509 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:16:11 crc kubenswrapper[4835]: I1003 18:16:11.876621 4835 scope.go:117] "RemoveContainer" containerID="9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0" Oct 03 18:16:11 crc kubenswrapper[4835]: E1003 18:16:11.876621 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:16:11 crc kubenswrapper[4835]: E1003 18:16:11.876738 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p2w8j_openshift-ovn-kubernetes(48bbeb2a-b75a-4650-b5ea-b180b8c0168a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" Oct 03 18:16:12 crc kubenswrapper[4835]: I1003 18:16:12.875868 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:16:12 crc kubenswrapper[4835]: E1003 18:16:12.875990 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:16:12 crc kubenswrapper[4835]: I1003 18:16:12.876049 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:12 crc kubenswrapper[4835]: E1003 18:16:12.876283 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:16:13 crc kubenswrapper[4835]: I1003 18:16:13.365272 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8p9cd_fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93/kube-multus/1.log" Oct 03 18:16:13 crc kubenswrapper[4835]: I1003 18:16:13.365980 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8p9cd_fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93/kube-multus/0.log" Oct 03 18:16:13 crc kubenswrapper[4835]: I1003 18:16:13.366104 4835 generic.go:334] "Generic (PLEG): container finished" podID="fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93" containerID="12ccf52445e391368af99975592bd8f1206e9a136c9bc04732839082fcaecde1" exitCode=1 Oct 03 18:16:13 crc kubenswrapper[4835]: I1003 18:16:13.366157 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8p9cd" event={"ID":"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93","Type":"ContainerDied","Data":"12ccf52445e391368af99975592bd8f1206e9a136c9bc04732839082fcaecde1"} Oct 03 18:16:13 crc kubenswrapper[4835]: I1003 18:16:13.366215 4835 scope.go:117] "RemoveContainer" containerID="d5ed4ea1c97965f4b350302c68387663665178447eba7a5a2d336ebbc6d11a33" Oct 03 18:16:13 crc kubenswrapper[4835]: I1003 18:16:13.366574 4835 scope.go:117] "RemoveContainer" containerID="12ccf52445e391368af99975592bd8f1206e9a136c9bc04732839082fcaecde1" Oct 03 18:16:13 crc kubenswrapper[4835]: E1003 18:16:13.366740 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8p9cd_openshift-multus(fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93)\"" pod="openshift-multus/multus-8p9cd" podUID="fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93" Oct 03 18:16:13 crc kubenswrapper[4835]: I1003 18:16:13.399978 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sxgvd" podStartSLOduration=95.399951585 podStartE2EDuration="1m35.399951585s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:15:56.328489333 +0000 UTC m=+98.044430195" watchObservedRunningTime="2025-10-03 18:16:13.399951585 +0000 UTC m=+115.115892497" Oct 03 18:16:13 crc kubenswrapper[4835]: I1003 18:16:13.876523 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:13 crc kubenswrapper[4835]: E1003 18:16:13.877308 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:16:13 crc kubenswrapper[4835]: I1003 18:16:13.876561 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:13 crc kubenswrapper[4835]: E1003 18:16:13.877586 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:16:14 crc kubenswrapper[4835]: I1003 18:16:14.370398 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8p9cd_fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93/kube-multus/1.log" Oct 03 18:16:14 crc kubenswrapper[4835]: I1003 18:16:14.876328 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:14 crc kubenswrapper[4835]: I1003 18:16:14.876458 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:16:14 crc kubenswrapper[4835]: E1003 18:16:14.876553 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:16:14 crc kubenswrapper[4835]: E1003 18:16:14.876814 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:16:15 crc kubenswrapper[4835]: I1003 18:16:15.876177 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:15 crc kubenswrapper[4835]: E1003 18:16:15.876271 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:16:15 crc kubenswrapper[4835]: I1003 18:16:15.876177 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:15 crc kubenswrapper[4835]: E1003 18:16:15.876328 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:16:16 crc kubenswrapper[4835]: I1003 18:16:16.875832 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:16 crc kubenswrapper[4835]: I1003 18:16:16.875844 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:16:16 crc kubenswrapper[4835]: E1003 18:16:16.876041 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:16:16 crc kubenswrapper[4835]: E1003 18:16:16.876189 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:16:17 crc kubenswrapper[4835]: I1003 18:16:17.876622 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:17 crc kubenswrapper[4835]: I1003 18:16:17.876654 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:17 crc kubenswrapper[4835]: E1003 18:16:17.876733 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:16:17 crc kubenswrapper[4835]: E1003 18:16:17.876879 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:16:18 crc kubenswrapper[4835]: E1003 18:16:18.825652 4835 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 03 18:16:18 crc kubenswrapper[4835]: I1003 18:16:18.876364 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:18 crc kubenswrapper[4835]: I1003 18:16:18.876535 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:16:18 crc kubenswrapper[4835]: E1003 18:16:18.876628 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:16:18 crc kubenswrapper[4835]: E1003 18:16:18.876822 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:16:18 crc kubenswrapper[4835]: E1003 18:16:18.975137 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 18:16:19 crc kubenswrapper[4835]: I1003 18:16:19.876501 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:19 crc kubenswrapper[4835]: I1003 18:16:19.876561 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:19 crc kubenswrapper[4835]: E1003 18:16:19.876653 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:16:19 crc kubenswrapper[4835]: E1003 18:16:19.876820 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:16:20 crc kubenswrapper[4835]: I1003 18:16:20.875956 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:16:20 crc kubenswrapper[4835]: I1003 18:16:20.876057 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:20 crc kubenswrapper[4835]: E1003 18:16:20.876161 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:16:20 crc kubenswrapper[4835]: E1003 18:16:20.876236 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:16:21 crc kubenswrapper[4835]: I1003 18:16:21.876766 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:21 crc kubenswrapper[4835]: I1003 18:16:21.876843 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:21 crc kubenswrapper[4835]: E1003 18:16:21.876912 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:16:21 crc kubenswrapper[4835]: E1003 18:16:21.877038 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:16:22 crc kubenswrapper[4835]: I1003 18:16:22.875915 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:16:22 crc kubenswrapper[4835]: E1003 18:16:22.876058 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:16:22 crc kubenswrapper[4835]: I1003 18:16:22.876167 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:22 crc kubenswrapper[4835]: E1003 18:16:22.876383 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:16:23 crc kubenswrapper[4835]: I1003 18:16:23.876769 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:23 crc kubenswrapper[4835]: I1003 18:16:23.876825 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:23 crc kubenswrapper[4835]: I1003 18:16:23.877142 4835 scope.go:117] "RemoveContainer" containerID="12ccf52445e391368af99975592bd8f1206e9a136c9bc04732839082fcaecde1" Oct 03 18:16:23 crc kubenswrapper[4835]: E1003 18:16:23.877430 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:16:23 crc kubenswrapper[4835]: E1003 18:16:23.877319 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:16:23 crc kubenswrapper[4835]: I1003 18:16:23.877523 4835 scope.go:117] "RemoveContainer" containerID="9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0" Oct 03 18:16:23 crc kubenswrapper[4835]: E1003 18:16:23.977191 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
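[Editor's note, illustrative only] The entries above show the same failure repeating for the same four pods (network-check-source, network-check-target, network-metrics-daemon-vlmkl, networking-console-plugin) roughly every two seconds: NetworkReady=false because no CNI configuration file exists yet in /etc/kubernetes/cni/net.d/, while ovnkube-controller sits in CrashLoopBackOff. When triaging a dump like this it can help to collapse the repetition into a per-pod summary. The sketch below is a minimal, stand-alone Python helper for doing that; it is not part of kubelet. It assumes normal journalctl output with one entry per line (in this wrapped dump, entries span lines), and the input file name kubelet-journal.log is a placeholder. The matched strings ("Error syncing pod, skipping", NetworkPluginNotReady, pod="...") are taken verbatim from the entries above.

#!/usr/bin/env python3
"""Summarize recurring NetworkPluginNotReady sync errors in a kubelet journal dump."""
import re
import sys
from collections import defaultdict

# pod="namespace/name" as it appears in the kubelet's structured log fields
POD_RE = re.compile(r'pod="(?P<pod>[^"]+)"')
# journald timestamp prefix, e.g. "Oct 03 18:16:23"
TS_RE = re.compile(r'^(?P<ts>\w{3} \d{2} \d{2}:\d{2}:\d{2}) ')

def summarize(lines):
    """Return {pod: [count, first_ts, last_ts]} for NetworkPluginNotReady sync errors."""
    seen = defaultdict(lambda: [0, None, None])
    for line in lines:
        if "Error syncing pod, skipping" not in line or "NetworkPluginNotReady" not in line:
            continue
        pod_m = POD_RE.search(line)
        if not pod_m:
            continue
        ts_m = TS_RE.match(line)
        ts = ts_m.group("ts") if ts_m else "?"
        entry = seen[pod_m.group("pod")]
        entry[0] += 1               # occurrence count
        entry[1] = entry[1] or ts   # first occurrence
        entry[2] = ts               # last occurrence
    return seen

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "kubelet-journal.log"  # assumed file name
    with open(path, encoding="utf-8", errors="replace") as f:
        for pod, (count, first, last) in sorted(summarize(f).items()):
            print(f"{pod}: {count} sync errors, first {first}, last {last}")

Run against a dump like this one, the summary makes the recovery point easy to spot: the errors for all four pods stop shortly after ovnkube-node-p2w8j restarts its ovnkube-controller container at 18:16:24 and the node records NodeReady at 18:16:36, as shown in the entries that follow.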
Oct 03 18:16:24 crc kubenswrapper[4835]: I1003 18:16:24.402276 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovnkube-controller/3.log" Oct 03 18:16:24 crc kubenswrapper[4835]: I1003 18:16:24.405182 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerStarted","Data":"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a"} Oct 03 18:16:24 crc kubenswrapper[4835]: I1003 18:16:24.405719 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:16:24 crc kubenswrapper[4835]: I1003 18:16:24.407276 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8p9cd_fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93/kube-multus/1.log" Oct 03 18:16:24 crc kubenswrapper[4835]: I1003 18:16:24.407322 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8p9cd" event={"ID":"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93","Type":"ContainerStarted","Data":"c4dfc32a4cce452f819127ad7835b9e48ebe8c563def12944d355e0868bed268"} Oct 03 18:16:24 crc kubenswrapper[4835]: I1003 18:16:24.430990 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podStartSLOduration=106.430928069 podStartE2EDuration="1m46.430928069s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:24.429764528 +0000 UTC m=+126.145705420" watchObservedRunningTime="2025-10-03 18:16:24.430928069 +0000 UTC m=+126.146868941" Oct 03 18:16:24 crc kubenswrapper[4835]: I1003 18:16:24.675636 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vlmkl"] Oct 03 18:16:24 crc kubenswrapper[4835]: I1003 18:16:24.675745 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:16:24 crc kubenswrapper[4835]: E1003 18:16:24.675826 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:16:24 crc kubenswrapper[4835]: I1003 18:16:24.876114 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:24 crc kubenswrapper[4835]: E1003 18:16:24.876519 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:16:25 crc kubenswrapper[4835]: I1003 18:16:25.876683 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:16:25 crc kubenswrapper[4835]: I1003 18:16:25.876782 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:25 crc kubenswrapper[4835]: I1003 18:16:25.876844 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:25 crc kubenswrapper[4835]: E1003 18:16:25.876895 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:16:25 crc kubenswrapper[4835]: E1003 18:16:25.877021 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:16:25 crc kubenswrapper[4835]: E1003 18:16:25.877122 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:16:26 crc kubenswrapper[4835]: I1003 18:16:26.877016 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:26 crc kubenswrapper[4835]: E1003 18:16:26.877176 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:16:27 crc kubenswrapper[4835]: I1003 18:16:27.876440 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:16:27 crc kubenswrapper[4835]: I1003 18:16:27.876470 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:27 crc kubenswrapper[4835]: I1003 18:16:27.876486 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:27 crc kubenswrapper[4835]: E1003 18:16:27.876555 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vlmkl" podUID="e2705556-f411-476d-9d8a-78543bae8dc7" Oct 03 18:16:27 crc kubenswrapper[4835]: E1003 18:16:27.876610 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 18:16:27 crc kubenswrapper[4835]: E1003 18:16:27.876702 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 18:16:28 crc kubenswrapper[4835]: I1003 18:16:28.876388 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:28 crc kubenswrapper[4835]: E1003 18:16:28.878556 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 18:16:29 crc kubenswrapper[4835]: I1003 18:16:29.876448 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:29 crc kubenswrapper[4835]: I1003 18:16:29.876445 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:29 crc kubenswrapper[4835]: I1003 18:16:29.876469 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:16:29 crc kubenswrapper[4835]: I1003 18:16:29.877934 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 03 18:16:29 crc kubenswrapper[4835]: I1003 18:16:29.878404 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 03 18:16:29 crc kubenswrapper[4835]: I1003 18:16:29.878583 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 03 18:16:29 crc kubenswrapper[4835]: I1003 18:16:29.878841 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 03 18:16:29 crc kubenswrapper[4835]: I1003 18:16:29.878898 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 03 18:16:29 crc kubenswrapper[4835]: I1003 18:16:29.879156 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 03 18:16:30 crc kubenswrapper[4835]: I1003 18:16:30.876612 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:35 crc kubenswrapper[4835]: I1003 18:16:35.040856 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.438845 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.475584 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-v7t7j"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.476028 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.476411 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.476453 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tz58z"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.476821 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.477684 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-tz58z" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.478487 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkfmz"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.478806 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkfmz" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.480292 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.480464 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.480645 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.480693 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.480806 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.480873 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.480949 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.480961 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.480887 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.481045 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.481445 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.484179 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.484303 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.484388 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.484436 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.484506 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.484550 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.484659 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 
18:16:36.484788 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.484976 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.487661 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.489812 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.491297 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.491691 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.492009 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.492645 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.494523 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-8nwfg"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.494755 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4tmrr"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.494967 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.495236 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.505233 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nh4lt"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.505816 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.505826 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.505866 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.505817 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.506130 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.506495 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.506616 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.506685 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.506894 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.508633 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.508850 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.508963 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.509082 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.509227 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.509382 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.509524 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.509644 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.509743 4835 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.509806 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.509945 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.510180 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.509766 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.510291 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.510489 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.510323 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.511552 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.512301 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-mcjr4"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.512750 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mcjr4" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.512807 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.513631 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v7gmf"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.514147 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p7wfs"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.515211 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.515489 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.517473 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.524505 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.541514 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.542398 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.543001 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.543239 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.543332 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.543402 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.543457 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.543517 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.543592 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.543665 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.543716 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.544411 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.543728 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.543736 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.545265 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.543770 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.543865 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.543890 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.543931 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.543955 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.544887 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.544952 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.547458 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.547598 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.547702 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.547761 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.549208 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.551201 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-gnrjg"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.552772 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.551819 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.559260 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pglch"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.559385 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.559720 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.560032 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.561147 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pglch" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.561820 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.562024 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.562149 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.562186 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.562310 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.562343 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.562516 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.562559 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.562629 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.562637 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.562520 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.562861 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.562929 4835 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.563033 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.569046 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.570058 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-prk5w"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.570700 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prk5w" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.571334 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.571688 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.572710 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.572868 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.572970 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.573082 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.573088 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.573188 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.573296 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.573310 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.573611 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.573729 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.575959 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.577133 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/76f0bfed-fea0-4d28-bb03-2d3b0ae79d92-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-v7t7j\" (UID: \"76f0bfed-fea0-4d28-bb03-2d3b0ae79d92\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.577162 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/13a81cd4-6b06-4a2b-a2c2-4a0778518313-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-bkfmz\" (UID: \"13a81cd4-6b06-4a2b-a2c2-4a0778518313\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkfmz" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.577185 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/76f0bfed-fea0-4d28-bb03-2d3b0ae79d92-images\") pod \"machine-api-operator-5694c8668f-v7t7j\" (UID: \"76f0bfed-fea0-4d28-bb03-2d3b0ae79d92\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.577207 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76f0bfed-fea0-4d28-bb03-2d3b0ae79d92-config\") pod \"machine-api-operator-5694c8668f-v7t7j\" (UID: \"76f0bfed-fea0-4d28-bb03-2d3b0ae79d92\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.577232 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/160f473b-3942-4848-affa-bb555f0068bc-trusted-ca\") pod \"console-operator-58897d9998-tz58z\" (UID: \"160f473b-3942-4848-affa-bb555f0068bc\") " pod="openshift-console-operator/console-operator-58897d9998-tz58z" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.577251 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b59d837-ca72-447d-8b77-42675b0ec49b-serving-cert\") pod \"route-controller-manager-6576b87f9c-mr45m\" (UID: \"6b59d837-ca72-447d-8b77-42675b0ec49b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.577270 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/160f473b-3942-4848-affa-bb555f0068bc-config\") pod \"console-operator-58897d9998-tz58z\" (UID: \"160f473b-3942-4848-affa-bb555f0068bc\") " pod="openshift-console-operator/console-operator-58897d9998-tz58z" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.577295 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-527px\" (UniqueName: \"kubernetes.io/projected/76f0bfed-fea0-4d28-bb03-2d3b0ae79d92-kube-api-access-527px\") pod \"machine-api-operator-5694c8668f-v7t7j\" (UID: \"76f0bfed-fea0-4d28-bb03-2d3b0ae79d92\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.577318 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c9rb\" (UniqueName: \"kubernetes.io/projected/160f473b-3942-4848-affa-bb555f0068bc-kube-api-access-5c9rb\") pod \"console-operator-58897d9998-tz58z\" (UID: \"160f473b-3942-4848-affa-bb555f0068bc\") " pod="openshift-console-operator/console-operator-58897d9998-tz58z" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.577348 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b59d837-ca72-447d-8b77-42675b0ec49b-config\") pod \"route-controller-manager-6576b87f9c-mr45m\" (UID: \"6b59d837-ca72-447d-8b77-42675b0ec49b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.577364 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc8c9\" (UniqueName: \"kubernetes.io/projected/6b59d837-ca72-447d-8b77-42675b0ec49b-kube-api-access-tc8c9\") pod \"route-controller-manager-6576b87f9c-mr45m\" (UID: \"6b59d837-ca72-447d-8b77-42675b0ec49b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.577380 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpzlv\" (UniqueName: \"kubernetes.io/projected/13a81cd4-6b06-4a2b-a2c2-4a0778518313-kube-api-access-mpzlv\") pod \"cluster-samples-operator-665b6dd947-bkfmz\" (UID: \"13a81cd4-6b06-4a2b-a2c2-4a0778518313\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkfmz" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.577393 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b59d837-ca72-447d-8b77-42675b0ec49b-client-ca\") pod \"route-controller-manager-6576b87f9c-mr45m\" (UID: \"6b59d837-ca72-447d-8b77-42675b0ec49b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.577416 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/160f473b-3942-4848-affa-bb555f0068bc-serving-cert\") pod \"console-operator-58897d9998-tz58z\" (UID: \"160f473b-3942-4848-affa-bb555f0068bc\") " pod="openshift-console-operator/console-operator-58897d9998-tz58z" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.578659 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.583331 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.585596 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.583585 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.584011 4835 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-qgmg8"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.586622 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qgmg8" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.585489 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.587296 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.601862 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.636613 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.637136 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.637401 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.637942 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.639707 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.642912 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.643124 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.647439 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.655123 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.655622 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fqtzs"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.655965 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.656378 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.656840 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.656987 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-fqtzs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.659003 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5rj92"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.659452 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5rj92" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.661823 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.665753 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqldm"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.666316 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.666629 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.666868 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.666977 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.667620 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.671121 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-4fnq6"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.672109 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.673886 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pf9vb"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.674748 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.674877 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.675210 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.675675 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.675769 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.675798 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.676218 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.676423 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-677dr"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.677033 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-677dr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.677396 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-v7t7j"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678295 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56ad1b37-3737-409f-a332-2676129348b6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttggp\" (UID: \"56ad1b37-3737-409f-a332-2676129348b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678329 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/76f0bfed-fea0-4d28-bb03-2d3b0ae79d92-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-v7t7j\" (UID: \"76f0bfed-fea0-4d28-bb03-2d3b0ae79d92\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678349 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqhvg\" (UniqueName: \"kubernetes.io/projected/3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8-kube-api-access-hqhvg\") pod \"openshift-apiserver-operator-796bbdcf4f-dvkvj\" (UID: \"3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678369 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqvtp\" (UniqueName: \"kubernetes.io/projected/16d04d7d-2f2d-4bf7-a65f-0bda7bec3fb7-kube-api-access-lqvtp\") pod \"dns-operator-744455d44c-qgmg8\" (UID: \"16d04d7d-2f2d-4bf7-a65f-0bda7bec3fb7\") " pod="openshift-dns-operator/dns-operator-744455d44c-qgmg8" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678384 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56ad1b37-3737-409f-a332-2676129348b6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttggp\" (UID: \"56ad1b37-3737-409f-a332-2676129348b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678431 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4p8g\" (UniqueName: \"kubernetes.io/projected/96ad30f4-8507-4530-af53-06c628b1388e-kube-api-access-d4p8g\") pod \"openshift-controller-manager-operator-756b6f6bc6-lbkjb\" (UID: 
\"96ad30f4-8507-4530-af53-06c628b1388e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678449 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24baba3b-d1f1-426a-88a9-9bd5cb44112d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mc5r2\" (UID: \"24baba3b-d1f1-426a-88a9-9bd5cb44112d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678464 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ea7745-8aa0-4bcd-86e4-326313d026bd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678501 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjsxz\" (UniqueName: \"kubernetes.io/projected/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-kube-api-access-fjsxz\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678519 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/13a81cd4-6b06-4a2b-a2c2-4a0778518313-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-bkfmz\" (UID: \"13a81cd4-6b06-4a2b-a2c2-4a0778518313\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkfmz" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678533 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-config\") pod \"controller-manager-879f6c89f-4tmrr\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678549 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8jkr\" (UniqueName: \"kubernetes.io/projected/56bc4f0f-9acd-4179-be10-9ad383cbf689-kube-api-access-k8jkr\") pod \"openshift-config-operator-7777fb866f-q6jkk\" (UID: \"56bc4f0f-9acd-4179-be10-9ad383cbf689\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678564 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-audit-policies\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678634 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7807d4de-42f2-489a-af4d-0317ebbc154c-serving-cert\") pod \"service-ca-operator-777779d784-pglch\" 
(UID: \"7807d4de-42f2-489a-af4d-0317ebbc154c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pglch" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678656 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24baba3b-d1f1-426a-88a9-9bd5cb44112d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mc5r2\" (UID: \"24baba3b-d1f1-426a-88a9-9bd5cb44112d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678674 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76f0bfed-fea0-4d28-bb03-2d3b0ae79d92-config\") pod \"machine-api-operator-5694c8668f-v7t7j\" (UID: \"76f0bfed-fea0-4d28-bb03-2d3b0ae79d92\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678691 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/76f0bfed-fea0-4d28-bb03-2d3b0ae79d92-images\") pod \"machine-api-operator-5694c8668f-v7t7j\" (UID: \"76f0bfed-fea0-4d28-bb03-2d3b0ae79d92\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678709 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-serving-cert\") pod \"controller-manager-879f6c89f-4tmrr\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678724 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-audit-policies\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678740 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f3ea7745-8aa0-4bcd-86e4-326313d026bd-node-pullsecrets\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678755 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3ea7745-8aa0-4bcd-86e4-326313d026bd-serving-cert\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678795 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56bc4f0f-9acd-4179-be10-9ad383cbf689-serving-cert\") pod \"openshift-config-operator-7777fb866f-q6jkk\" (UID: \"56bc4f0f-9acd-4179-be10-9ad383cbf689\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" Oct 03 18:16:36 crc 
kubenswrapper[4835]: I1003 18:16:36.678812 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-etcd-client\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678879 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678898 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f3ea7745-8aa0-4bcd-86e4-326313d026bd-etcd-client\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678912 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f3ea7745-8aa0-4bcd-86e4-326313d026bd-etcd-serving-ca\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678928 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2cc40a88-90de-40be-b285-4b7f8bd11709-etcd-client\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678960 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.678975 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0eca3a7-03cd-4126-83d7-b15db1a7232f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v7gmf\" (UID: \"f0eca3a7-03cd-4126-83d7-b15db1a7232f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679010 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-service-ca\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679025 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-config\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679567 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-oauth-serving-cert\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679616 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc1c2c4-ec18-46df-b876-157c08bbde36-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xjd4f\" (UID: \"1bc1c2c4-ec18-46df-b876-157c08bbde36\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679634 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679650 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24baba3b-d1f1-426a-88a9-9bd5cb44112d-config\") pod \"kube-controller-manager-operator-78b949d7b-mc5r2\" (UID: \"24baba3b-d1f1-426a-88a9-9bd5cb44112d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679678 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f3ea7745-8aa0-4bcd-86e4-326313d026bd-image-import-ca\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679693 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phjd9\" (UniqueName: \"kubernetes.io/projected/f0eca3a7-03cd-4126-83d7-b15db1a7232f-kube-api-access-phjd9\") pod \"authentication-operator-69f744f599-v7gmf\" (UID: \"f0eca3a7-03cd-4126-83d7-b15db1a7232f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679719 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/160f473b-3942-4848-affa-bb555f0068bc-trusted-ca\") pod \"console-operator-58897d9998-tz58z\" (UID: \"160f473b-3942-4848-affa-bb555f0068bc\") " pod="openshift-console-operator/console-operator-58897d9998-tz58z" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679736 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-oauth-config\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679751 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-244rv\" (UniqueName: \"kubernetes.io/projected/476e65c8-6293-4edf-b6a6-197763d8f7e1-kube-api-access-244rv\") pod \"machine-approver-56656f9798-95g8t\" (UID: \"476e65c8-6293-4edf-b6a6-197763d8f7e1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679765 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679783 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b59d837-ca72-447d-8b77-42675b0ec49b-serving-cert\") pod \"route-controller-manager-6576b87f9c-mr45m\" (UID: \"6b59d837-ca72-447d-8b77-42675b0ec49b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679798 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lwndq\" (UID: \"6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679827 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dvkvj\" (UID: \"3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679843 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96ad30f4-8507-4530-af53-06c628b1388e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lbkjb\" (UID: \"96ad30f4-8507-4530-af53-06c628b1388e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679858 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvjt4\" (UniqueName: \"kubernetes.io/projected/b2b48efb-bafd-45af-b56e-4568f2416af8-kube-api-access-mvjt4\") pod \"migrator-59844c95c7-prk5w\" (UID: \"b2b48efb-bafd-45af-b56e-4568f2416af8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prk5w" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679874 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96ad30f4-8507-4530-af53-06c628b1388e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lbkjb\" (UID: \"96ad30f4-8507-4530-af53-06c628b1388e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679890 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lwndq\" (UID: \"6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679906 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679919 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7807d4de-42f2-489a-af4d-0317ebbc154c-config\") pod \"service-ca-operator-777779d784-pglch\" (UID: \"7807d4de-42f2-489a-af4d-0317ebbc154c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pglch" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679935 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3ea7745-8aa0-4bcd-86e4-326313d026bd-config\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679949 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/476e65c8-6293-4edf-b6a6-197763d8f7e1-config\") pod \"machine-approver-56656f9798-95g8t\" (UID: \"476e65c8-6293-4edf-b6a6-197763d8f7e1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679963 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679978 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6pq5\" (UniqueName: \"kubernetes.io/projected/efa22703-0a66-407f-9f2e-333bac190ce8-kube-api-access-k6pq5\") pod \"downloads-7954f5f757-mcjr4\" (UID: \"efa22703-0a66-407f-9f2e-333bac190ce8\") " pod="openshift-console/downloads-7954f5f757-mcjr4" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.679992 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-trusted-ca-bundle\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680006 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-serving-cert\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680020 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680033 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d04d7d-2f2d-4bf7-a65f-0bda7bec3fb7-metrics-tls\") pod \"dns-operator-744455d44c-qgmg8\" (UID: \"16d04d7d-2f2d-4bf7-a65f-0bda7bec3fb7\") " pod="openshift-dns-operator/dns-operator-744455d44c-qgmg8" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680048 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0eca3a7-03cd-4126-83d7-b15db1a7232f-service-ca-bundle\") pod \"authentication-operator-69f744f599-v7gmf\" (UID: \"f0eca3a7-03cd-4126-83d7-b15db1a7232f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680063 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/160f473b-3942-4848-affa-bb555f0068bc-config\") pod \"console-operator-58897d9998-tz58z\" (UID: \"160f473b-3942-4848-affa-bb555f0068bc\") " pod="openshift-console-operator/console-operator-58897d9998-tz58z" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680094 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2cc40a88-90de-40be-b285-4b7f8bd11709-etcd-ca\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680133 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-encryption-config\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680147 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-audit-dir\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680163 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/56bc4f0f-9acd-4179-be10-9ad383cbf689-available-featuregates\") pod \"openshift-config-operator-7777fb866f-q6jkk\" (UID: \"56bc4f0f-9acd-4179-be10-9ad383cbf689\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680180 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2cc40a88-90de-40be-b285-4b7f8bd11709-etcd-service-ca\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680194 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680240 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqsdt\" (UniqueName: \"kubernetes.io/projected/1bc1c2c4-ec18-46df-b876-157c08bbde36-kube-api-access-zqsdt\") pod \"kube-storage-version-migrator-operator-b67b599dd-xjd4f\" (UID: \"1bc1c2c4-ec18-46df-b876-157c08bbde36\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680256 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/476e65c8-6293-4edf-b6a6-197763d8f7e1-auth-proxy-config\") pod \"machine-approver-56656f9798-95g8t\" (UID: \"476e65c8-6293-4edf-b6a6-197763d8f7e1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680345 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680111 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkfmz"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680529 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680541 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.680345 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scxdk\" (UniqueName: 
\"kubernetes.io/projected/2cc40a88-90de-40be-b285-4b7f8bd11709-kube-api-access-scxdk\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681469 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681503 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681526 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c9rb\" (UniqueName: \"kubernetes.io/projected/160f473b-3942-4848-affa-bb555f0068bc-kube-api-access-5c9rb\") pod \"console-operator-58897d9998-tz58z\" (UID: \"160f473b-3942-4848-affa-bb555f0068bc\") " pod="openshift-console-operator/console-operator-58897d9998-tz58z" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681546 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-serving-cert\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681563 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9nwp\" (UniqueName: \"kubernetes.io/projected/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-kube-api-access-r9nwp\") pod \"controller-manager-879f6c89f-4tmrr\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681580 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc40a88-90de-40be-b285-4b7f8bd11709-config\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681599 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-527px\" (UniqueName: \"kubernetes.io/projected/76f0bfed-fea0-4d28-bb03-2d3b0ae79d92-kube-api-access-527px\") pod \"machine-api-operator-5694c8668f-v7t7j\" (UID: \"76f0bfed-fea0-4d28-bb03-2d3b0ae79d92\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681708 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/56ad1b37-3737-409f-a332-2676129348b6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttggp\" (UID: \"56ad1b37-3737-409f-a332-2676129348b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681720 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tz58z"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681735 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3ea7745-8aa0-4bcd-86e4-326313d026bd-audit-dir\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681753 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5vnb\" (UniqueName: \"kubernetes.io/projected/f3ea7745-8aa0-4bcd-86e4-326313d026bd-kube-api-access-m5vnb\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681814 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b59d837-ca72-447d-8b77-42675b0ec49b-config\") pod \"route-controller-manager-6576b87f9c-mr45m\" (UID: \"6b59d837-ca72-447d-8b77-42675b0ec49b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681818 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76f0bfed-fea0-4d28-bb03-2d3b0ae79d92-config\") pod \"machine-api-operator-5694c8668f-v7t7j\" (UID: \"76f0bfed-fea0-4d28-bb03-2d3b0ae79d92\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681832 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc8c9\" (UniqueName: \"kubernetes.io/projected/6b59d837-ca72-447d-8b77-42675b0ec49b-kube-api-access-tc8c9\") pod \"route-controller-manager-6576b87f9c-mr45m\" (UID: \"6b59d837-ca72-447d-8b77-42675b0ec49b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681850 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0eca3a7-03cd-4126-83d7-b15db1a7232f-serving-cert\") pod \"authentication-operator-69f744f599-v7gmf\" (UID: \"f0eca3a7-03cd-4126-83d7-b15db1a7232f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681884 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.681967 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpzlv\" (UniqueName: \"kubernetes.io/projected/13a81cd4-6b06-4a2b-a2c2-4a0778518313-kube-api-access-mpzlv\") pod \"cluster-samples-operator-665b6dd947-bkfmz\" (UID: \"13a81cd4-6b06-4a2b-a2c2-4a0778518313\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkfmz" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.682103 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b59d837-ca72-447d-8b77-42675b0ec49b-client-ca\") pod \"route-controller-manager-6576b87f9c-mr45m\" (UID: \"6b59d837-ca72-447d-8b77-42675b0ec49b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.682182 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/160f473b-3942-4848-affa-bb555f0068bc-config\") pod \"console-operator-58897d9998-tz58z\" (UID: \"160f473b-3942-4848-affa-bb555f0068bc\") " pod="openshift-console-operator/console-operator-58897d9998-tz58z" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.682208 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lwndq\" (UID: \"6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.682901 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/76f0bfed-fea0-4d28-bb03-2d3b0ae79d92-images\") pod \"machine-api-operator-5694c8668f-v7t7j\" (UID: \"76f0bfed-fea0-4d28-bb03-2d3b0ae79d92\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.682923 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4tmrr\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.682868 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b59d837-ca72-447d-8b77-42675b0ec49b-client-ca\") pod \"route-controller-manager-6576b87f9c-mr45m\" (UID: \"6b59d837-ca72-447d-8b77-42675b0ec49b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.682952 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/476e65c8-6293-4edf-b6a6-197763d8f7e1-machine-approver-tls\") pod \"machine-approver-56656f9798-95g8t\" (UID: \"476e65c8-6293-4edf-b6a6-197763d8f7e1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.682780 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6b59d837-ca72-447d-8b77-42675b0ec49b-config\") pod \"route-controller-manager-6576b87f9c-mr45m\" (UID: \"6b59d837-ca72-447d-8b77-42675b0ec49b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.682455 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/160f473b-3942-4848-affa-bb555f0068bc-trusted-ca\") pod \"console-operator-58897d9998-tz58z\" (UID: \"160f473b-3942-4848-affa-bb555f0068bc\") " pod="openshift-console-operator/console-operator-58897d9998-tz58z" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.682982 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f3ea7745-8aa0-4bcd-86e4-326313d026bd-encryption-config\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.683000 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0eca3a7-03cd-4126-83d7-b15db1a7232f-config\") pod \"authentication-operator-69f744f599-v7gmf\" (UID: \"f0eca3a7-03cd-4126-83d7-b15db1a7232f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.682886 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4tmrr"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.683027 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc40a88-90de-40be-b285-4b7f8bd11709-serving-cert\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.683061 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.683107 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fxnq\" (UniqueName: \"kubernetes.io/projected/6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47-kube-api-access-9fxnq\") pod \"cluster-image-registry-operator-dc59b4c8b-lwndq\" (UID: \"6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.683126 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dvkvj\" (UID: \"3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 
18:16:36.683145 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr2n7\" (UniqueName: \"kubernetes.io/projected/810a8fd3-d63d-4fd1-b6f1-186457e8878a-kube-api-access-zr2n7\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.683160 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp8xm\" (UniqueName: \"kubernetes.io/projected/7807d4de-42f2-489a-af4d-0317ebbc154c-kube-api-access-sp8xm\") pod \"service-ca-operator-777779d784-pglch\" (UID: \"7807d4de-42f2-489a-af4d-0317ebbc154c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pglch" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.683175 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f3ea7745-8aa0-4bcd-86e4-326313d026bd-audit\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.683192 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/160f473b-3942-4848-affa-bb555f0068bc-serving-cert\") pod \"console-operator-58897d9998-tz58z\" (UID: \"160f473b-3942-4848-affa-bb555f0068bc\") " pod="openshift-console-operator/console-operator-58897d9998-tz58z" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.683209 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bc1c2c4-ec18-46df-b876-157c08bbde36-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xjd4f\" (UID: \"1bc1c2c4-ec18-46df-b876-157c08bbde36\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.683225 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-client-ca\") pod \"controller-manager-879f6c89f-4tmrr\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.683241 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/810a8fd3-d63d-4fd1-b6f1-186457e8878a-audit-dir\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.683257 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.683282 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whqgr\" (UniqueName: \"kubernetes.io/projected/300d2397-b9b1-4f44-9eb2-5757940cc64c-kube-api-access-whqgr\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.683809 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lf89q"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.684499 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lf89q" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.684775 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b59d837-ca72-447d-8b77-42675b0ec49b-serving-cert\") pod \"route-controller-manager-6576b87f9c-mr45m\" (UID: \"6b59d837-ca72-447d-8b77-42675b0ec49b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.685414 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/76f0bfed-fea0-4d28-bb03-2d3b0ae79d92-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-v7t7j\" (UID: \"76f0bfed-fea0-4d28-bb03-2d3b0ae79d92\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.685821 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/13a81cd4-6b06-4a2b-a2c2-4a0778518313-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-bkfmz\" (UID: \"13a81cd4-6b06-4a2b-a2c2-4a0778518313\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkfmz" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.693140 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8nwfg"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.696914 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nh4lt"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.696945 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mcjr4"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.696955 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.698164 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.698721 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/160f473b-3942-4848-affa-bb555f0068bc-serving-cert\") pod \"console-operator-58897d9998-tz58z\" (UID: \"160f473b-3942-4848-affa-bb555f0068bc\") " pod="openshift-console-operator/console-operator-58897d9998-tz58z" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.700825 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.704877 4835 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.707020 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-prk5w"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.708122 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v7gmf"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.709262 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p7wfs"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.717807 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.717848 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.719169 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.721652 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.721960 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.723208 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqldm"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.724144 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pf9vb"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.725123 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.726875 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qgmg8"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.729593 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fqtzs"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.729630 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-gnrjg"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.729640 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.731518 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.732130 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.733020 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pglch"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.734350 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.735531 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lf89q"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.737447 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.739122 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.739795 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.741204 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-lt54v"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.741995 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-lt54v" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.742569 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-26hp9"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.743167 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-26hp9" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.743997 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5rj92"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.745101 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.746398 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-26hp9"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.747566 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-677dr"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.748549 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.750965 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bv6gz"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.751820 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.752011 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bv6gz"] Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.758713 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.778751 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.783663 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqsdt\" (UniqueName: \"kubernetes.io/projected/1bc1c2c4-ec18-46df-b876-157c08bbde36-kube-api-access-zqsdt\") pod \"kube-storage-version-migrator-operator-b67b599dd-xjd4f\" (UID: \"1bc1c2c4-ec18-46df-b876-157c08bbde36\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.783687 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/476e65c8-6293-4edf-b6a6-197763d8f7e1-auth-proxy-config\") pod \"machine-approver-56656f9798-95g8t\" (UID: \"476e65c8-6293-4edf-b6a6-197763d8f7e1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.783704 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scxdk\" (UniqueName: \"kubernetes.io/projected/2cc40a88-90de-40be-b285-4b7f8bd11709-kube-api-access-scxdk\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.783722 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.783740 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.783762 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ebdd246-dcb5-4785-b1a4-7133f6317e91-trusted-ca\") pod \"ingress-operator-5b745b69d9-dbvth\" (UID: \"8ebdd246-dcb5-4785-b1a4-7133f6317e91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.783800 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-serving-cert\") pod 
\"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.783825 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9nwp\" (UniqueName: \"kubernetes.io/projected/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-kube-api-access-r9nwp\") pod \"controller-manager-879f6c89f-4tmrr\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.783850 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc40a88-90de-40be-b285-4b7f8bd11709-config\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.783884 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ad1b37-3737-409f-a332-2676129348b6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttggp\" (UID: \"56ad1b37-3737-409f-a332-2676129348b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.783907 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3ea7745-8aa0-4bcd-86e4-326313d026bd-audit-dir\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.783931 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5vnb\" (UniqueName: \"kubernetes.io/projected/f3ea7745-8aa0-4bcd-86e4-326313d026bd-kube-api-access-m5vnb\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.783960 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0eca3a7-03cd-4126-83d7-b15db1a7232f-serving-cert\") pod \"authentication-operator-69f744f599-v7gmf\" (UID: \"f0eca3a7-03cd-4126-83d7-b15db1a7232f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.783983 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784002 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lwndq\" (UID: \"6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" Oct 03 18:16:36 crc kubenswrapper[4835]: 
I1003 18:16:36.784016 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4tmrr\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784031 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/476e65c8-6293-4edf-b6a6-197763d8f7e1-machine-approver-tls\") pod \"machine-approver-56656f9798-95g8t\" (UID: \"476e65c8-6293-4edf-b6a6-197763d8f7e1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784045 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f3ea7745-8aa0-4bcd-86e4-326313d026bd-encryption-config\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784063 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0eca3a7-03cd-4126-83d7-b15db1a7232f-config\") pod \"authentication-operator-69f744f599-v7gmf\" (UID: \"f0eca3a7-03cd-4126-83d7-b15db1a7232f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784100 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc40a88-90de-40be-b285-4b7f8bd11709-serving-cert\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784100 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3ea7745-8aa0-4bcd-86e4-326313d026bd-audit-dir\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784118 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784160 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fxnq\" (UniqueName: \"kubernetes.io/projected/6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47-kube-api-access-9fxnq\") pod \"cluster-image-registry-operator-dc59b4c8b-lwndq\" (UID: \"6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784181 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dvkvj\" (UID: \"3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784203 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr2n7\" (UniqueName: \"kubernetes.io/projected/810a8fd3-d63d-4fd1-b6f1-186457e8878a-kube-api-access-zr2n7\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784223 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp8xm\" (UniqueName: \"kubernetes.io/projected/7807d4de-42f2-489a-af4d-0317ebbc154c-kube-api-access-sp8xm\") pod \"service-ca-operator-777779d784-pglch\" (UID: \"7807d4de-42f2-489a-af4d-0317ebbc154c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pglch" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784241 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f3ea7745-8aa0-4bcd-86e4-326313d026bd-audit\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784269 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bc1c2c4-ec18-46df-b876-157c08bbde36-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xjd4f\" (UID: \"1bc1c2c4-ec18-46df-b876-157c08bbde36\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784287 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-client-ca\") pod \"controller-manager-879f6c89f-4tmrr\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784302 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/810a8fd3-d63d-4fd1-b6f1-186457e8878a-audit-dir\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784320 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784340 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whqgr\" (UniqueName: \"kubernetes.io/projected/300d2397-b9b1-4f44-9eb2-5757940cc64c-kube-api-access-whqgr\") pod \"console-f9d7485db-8nwfg\" (UID: 
\"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784360 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/32005ae4-91c7-48c7-a713-59738b849926-webhook-cert\") pod \"packageserver-d55dfcdfc-jxp2s\" (UID: \"32005ae4-91c7-48c7-a713-59738b849926\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784384 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56ad1b37-3737-409f-a332-2676129348b6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttggp\" (UID: \"56ad1b37-3737-409f-a332-2676129348b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784384 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/476e65c8-6293-4edf-b6a6-197763d8f7e1-auth-proxy-config\") pod \"machine-approver-56656f9798-95g8t\" (UID: \"476e65c8-6293-4edf-b6a6-197763d8f7e1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784401 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ebdd246-dcb5-4785-b1a4-7133f6317e91-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dbvth\" (UID: \"8ebdd246-dcb5-4785-b1a4-7133f6317e91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784456 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqhvg\" (UniqueName: \"kubernetes.io/projected/3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8-kube-api-access-hqhvg\") pod \"openshift-apiserver-operator-796bbdcf4f-dvkvj\" (UID: \"3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784479 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqvtp\" (UniqueName: \"kubernetes.io/projected/16d04d7d-2f2d-4bf7-a65f-0bda7bec3fb7-kube-api-access-lqvtp\") pod \"dns-operator-744455d44c-qgmg8\" (UID: \"16d04d7d-2f2d-4bf7-a65f-0bda7bec3fb7\") " pod="openshift-dns-operator/dns-operator-744455d44c-qgmg8" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784499 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56ad1b37-3737-409f-a332-2676129348b6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttggp\" (UID: \"56ad1b37-3737-409f-a332-2676129348b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784515 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4p8g\" (UniqueName: \"kubernetes.io/projected/96ad30f4-8507-4530-af53-06c628b1388e-kube-api-access-d4p8g\") pod \"openshift-controller-manager-operator-756b6f6bc6-lbkjb\" (UID: \"96ad30f4-8507-4530-af53-06c628b1388e\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784536 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24baba3b-d1f1-426a-88a9-9bd5cb44112d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mc5r2\" (UID: \"24baba3b-d1f1-426a-88a9-9bd5cb44112d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784557 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjsxz\" (UniqueName: \"kubernetes.io/projected/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-kube-api-access-fjsxz\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784575 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ea7745-8aa0-4bcd-86e4-326313d026bd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784594 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-config\") pod \"controller-manager-879f6c89f-4tmrr\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784617 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8jkr\" (UniqueName: \"kubernetes.io/projected/56bc4f0f-9acd-4179-be10-9ad383cbf689-kube-api-access-k8jkr\") pod \"openshift-config-operator-7777fb866f-q6jkk\" (UID: \"56bc4f0f-9acd-4179-be10-9ad383cbf689\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784633 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-audit-policies\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784649 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7807d4de-42f2-489a-af4d-0317ebbc154c-serving-cert\") pod \"service-ca-operator-777779d784-pglch\" (UID: \"7807d4de-42f2-489a-af4d-0317ebbc154c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pglch" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784668 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24baba3b-d1f1-426a-88a9-9bd5cb44112d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mc5r2\" (UID: \"24baba3b-d1f1-426a-88a9-9bd5cb44112d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2" Oct 03 18:16:36 crc kubenswrapper[4835]: 
I1003 18:16:36.784689 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/32005ae4-91c7-48c7-a713-59738b849926-apiservice-cert\") pod \"packageserver-d55dfcdfc-jxp2s\" (UID: \"32005ae4-91c7-48c7-a713-59738b849926\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784706 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ebdd246-dcb5-4785-b1a4-7133f6317e91-metrics-tls\") pod \"ingress-operator-5b745b69d9-dbvth\" (UID: \"8ebdd246-dcb5-4785-b1a4-7133f6317e91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784736 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-serving-cert\") pod \"controller-manager-879f6c89f-4tmrr\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784753 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-audit-policies\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784768 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f3ea7745-8aa0-4bcd-86e4-326313d026bd-node-pullsecrets\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784800 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3ea7745-8aa0-4bcd-86e4-326313d026bd-serving-cert\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784819 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blr4k\" (UniqueName: \"kubernetes.io/projected/32005ae4-91c7-48c7-a713-59738b849926-kube-api-access-blr4k\") pod \"packageserver-d55dfcdfc-jxp2s\" (UID: \"32005ae4-91c7-48c7-a713-59738b849926\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784837 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56bc4f0f-9acd-4179-be10-9ad383cbf689-serving-cert\") pod \"openshift-config-operator-7777fb866f-q6jkk\" (UID: \"56bc4f0f-9acd-4179-be10-9ad383cbf689\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784853 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5wvp\" (UniqueName: 
\"kubernetes.io/projected/8ebdd246-dcb5-4785-b1a4-7133f6317e91-kube-api-access-r5wvp\") pod \"ingress-operator-5b745b69d9-dbvth\" (UID: \"8ebdd246-dcb5-4785-b1a4-7133f6317e91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784872 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2cc40a88-90de-40be-b285-4b7f8bd11709-etcd-client\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784888 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-etcd-client\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784906 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784922 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f3ea7745-8aa0-4bcd-86e4-326313d026bd-etcd-client\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784936 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f3ea7745-8aa0-4bcd-86e4-326313d026bd-etcd-serving-ca\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784955 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/32005ae4-91c7-48c7-a713-59738b849926-tmpfs\") pod \"packageserver-d55dfcdfc-jxp2s\" (UID: \"32005ae4-91c7-48c7-a713-59738b849926\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784976 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-service-ca\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.784992 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 
18:16:36.785007 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0eca3a7-03cd-4126-83d7-b15db1a7232f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v7gmf\" (UID: \"f0eca3a7-03cd-4126-83d7-b15db1a7232f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785028 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-config\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785047 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-oauth-serving-cert\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785063 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc1c2c4-ec18-46df-b876-157c08bbde36-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xjd4f\" (UID: \"1bc1c2c4-ec18-46df-b876-157c08bbde36\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785103 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785120 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24baba3b-d1f1-426a-88a9-9bd5cb44112d-config\") pod \"kube-controller-manager-operator-78b949d7b-mc5r2\" (UID: \"24baba3b-d1f1-426a-88a9-9bd5cb44112d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785135 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f3ea7745-8aa0-4bcd-86e4-326313d026bd-image-import-ca\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785151 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phjd9\" (UniqueName: \"kubernetes.io/projected/f0eca3a7-03cd-4126-83d7-b15db1a7232f-kube-api-access-phjd9\") pod \"authentication-operator-69f744f599-v7gmf\" (UID: \"f0eca3a7-03cd-4126-83d7-b15db1a7232f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785172 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-oauth-config\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785187 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-244rv\" (UniqueName: \"kubernetes.io/projected/476e65c8-6293-4edf-b6a6-197763d8f7e1-kube-api-access-244rv\") pod \"machine-approver-56656f9798-95g8t\" (UID: \"476e65c8-6293-4edf-b6a6-197763d8f7e1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785203 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785226 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lwndq\" (UID: \"6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785242 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dvkvj\" (UID: \"3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785262 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvjt4\" (UniqueName: \"kubernetes.io/projected/b2b48efb-bafd-45af-b56e-4568f2416af8-kube-api-access-mvjt4\") pod \"migrator-59844c95c7-prk5w\" (UID: \"b2b48efb-bafd-45af-b56e-4568f2416af8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prk5w" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785280 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96ad30f4-8507-4530-af53-06c628b1388e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lbkjb\" (UID: \"96ad30f4-8507-4530-af53-06c628b1388e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785296 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96ad30f4-8507-4530-af53-06c628b1388e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lbkjb\" (UID: \"96ad30f4-8507-4530-af53-06c628b1388e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785318 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lwndq\" (UID: \"6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785334 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785350 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7807d4de-42f2-489a-af4d-0317ebbc154c-config\") pod \"service-ca-operator-777779d784-pglch\" (UID: \"7807d4de-42f2-489a-af4d-0317ebbc154c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pglch" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785366 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3ea7745-8aa0-4bcd-86e4-326313d026bd-config\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785383 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/476e65c8-6293-4edf-b6a6-197763d8f7e1-config\") pod \"machine-approver-56656f9798-95g8t\" (UID: \"476e65c8-6293-4edf-b6a6-197763d8f7e1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785399 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785418 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6pq5\" (UniqueName: \"kubernetes.io/projected/efa22703-0a66-407f-9f2e-333bac190ce8-kube-api-access-k6pq5\") pod \"downloads-7954f5f757-mcjr4\" (UID: \"efa22703-0a66-407f-9f2e-333bac190ce8\") " pod="openshift-console/downloads-7954f5f757-mcjr4" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785436 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-trusted-ca-bundle\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785451 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-serving-cert\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 
18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785468 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785483 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d04d7d-2f2d-4bf7-a65f-0bda7bec3fb7-metrics-tls\") pod \"dns-operator-744455d44c-qgmg8\" (UID: \"16d04d7d-2f2d-4bf7-a65f-0bda7bec3fb7\") " pod="openshift-dns-operator/dns-operator-744455d44c-qgmg8" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785498 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0eca3a7-03cd-4126-83d7-b15db1a7232f-service-ca-bundle\") pod \"authentication-operator-69f744f599-v7gmf\" (UID: \"f0eca3a7-03cd-4126-83d7-b15db1a7232f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785524 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2cc40a88-90de-40be-b285-4b7f8bd11709-etcd-ca\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785547 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-encryption-config\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785563 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-audit-dir\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785581 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/56bc4f0f-9acd-4179-be10-9ad383cbf689-available-featuregates\") pod \"openshift-config-operator-7777fb866f-q6jkk\" (UID: \"56bc4f0f-9acd-4179-be10-9ad383cbf689\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785597 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2cc40a88-90de-40be-b285-4b7f8bd11709-etcd-service-ca\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785613 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-etcd-serving-ca\") pod 
\"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.785921 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ea7745-8aa0-4bcd-86e4-326313d026bd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.786063 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/810a8fd3-d63d-4fd1-b6f1-186457e8878a-audit-dir\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.786524 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-client-ca\") pod \"controller-manager-879f6c89f-4tmrr\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.786566 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-config\") pod \"controller-manager-879f6c89f-4tmrr\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.786740 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.787343 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f3ea7745-8aa0-4bcd-86e4-326313d026bd-image-import-ca\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.787495 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-serving-cert\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.787697 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4tmrr\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.788044 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.788207 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0eca3a7-03cd-4126-83d7-b15db1a7232f-config\") pod \"authentication-operator-69f744f599-v7gmf\" (UID: \"f0eca3a7-03cd-4126-83d7-b15db1a7232f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.788305 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0eca3a7-03cd-4126-83d7-b15db1a7232f-serving-cert\") pod \"authentication-operator-69f744f599-v7gmf\" (UID: \"f0eca3a7-03cd-4126-83d7-b15db1a7232f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.788511 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f3ea7745-8aa0-4bcd-86e4-326313d026bd-node-pullsecrets\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.788630 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-audit-policies\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.788790 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2cc40a88-90de-40be-b285-4b7f8bd11709-etcd-client\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.789229 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0eca3a7-03cd-4126-83d7-b15db1a7232f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v7gmf\" (UID: \"f0eca3a7-03cd-4126-83d7-b15db1a7232f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.789330 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.789476 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc40a88-90de-40be-b285-4b7f8bd11709-config\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 
18:16:36.789577 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f3ea7745-8aa0-4bcd-86e4-326313d026bd-etcd-serving-ca\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.789619 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.789801 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-service-ca\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.789820 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-audit-policies\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.789852 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dvkvj\" (UID: \"3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.790057 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-audit-dir\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.790134 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.790287 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.790715 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0eca3a7-03cd-4126-83d7-b15db1a7232f-service-ca-bundle\") pod \"authentication-operator-69f744f599-v7gmf\" (UID: 
\"f0eca3a7-03cd-4126-83d7-b15db1a7232f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.790919 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-config\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.791200 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/56bc4f0f-9acd-4179-be10-9ad383cbf689-available-featuregates\") pod \"openshift-config-operator-7777fb866f-q6jkk\" (UID: \"56bc4f0f-9acd-4179-be10-9ad383cbf689\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.791306 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lwndq\" (UID: \"6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.791421 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2cc40a88-90de-40be-b285-4b7f8bd11709-etcd-service-ca\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.791432 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96ad30f4-8507-4530-af53-06c628b1388e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lbkjb\" (UID: \"96ad30f4-8507-4530-af53-06c628b1388e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.791740 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/476e65c8-6293-4edf-b6a6-197763d8f7e1-config\") pod \"machine-approver-56656f9798-95g8t\" (UID: \"476e65c8-6293-4edf-b6a6-197763d8f7e1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.791753 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dvkvj\" (UID: \"3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.791927 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-oauth-serving-cert\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.791934 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3ea7745-8aa0-4bcd-86e4-326313d026bd-config\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.792310 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96ad30f4-8507-4530-af53-06c628b1388e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lbkjb\" (UID: \"96ad30f4-8507-4530-af53-06c628b1388e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.792321 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-oauth-config\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.792415 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3ea7745-8aa0-4bcd-86e4-326313d026bd-serving-cert\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.792482 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.792559 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.792611 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-trusted-ca-bundle\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.793416 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-etcd-client\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.793540 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc 
kubenswrapper[4835]: I1003 18:16:36.793638 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-encryption-config\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.793785 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc40a88-90de-40be-b285-4b7f8bd11709-serving-cert\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.793925 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-serving-cert\") pod \"controller-manager-879f6c89f-4tmrr\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.794212 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f3ea7745-8aa0-4bcd-86e4-326313d026bd-etcd-client\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.794261 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56bc4f0f-9acd-4179-be10-9ad383cbf689-serving-cert\") pod \"openshift-config-operator-7777fb866f-q6jkk\" (UID: \"56bc4f0f-9acd-4179-be10-9ad383cbf689\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.794874 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f3ea7745-8aa0-4bcd-86e4-326313d026bd-encryption-config\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.795805 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lwndq\" (UID: \"6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.795906 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-serving-cert\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.796358 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.796473 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2cc40a88-90de-40be-b285-4b7f8bd11709-etcd-ca\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.796515 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f3ea7745-8aa0-4bcd-86e4-326313d026bd-audit\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.798462 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.799449 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.800060 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.806625 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/476e65c8-6293-4edf-b6a6-197763d8f7e1-machine-approver-tls\") pod \"machine-approver-56656f9798-95g8t\" (UID: \"476e65c8-6293-4edf-b6a6-197763d8f7e1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.807028 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.830662 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.838539 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.840462 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56ad1b37-3737-409f-a332-2676129348b6-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttggp\" (UID: \"56ad1b37-3737-409f-a332-2676129348b6\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.840741 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56ad1b37-3737-409f-a332-2676129348b6-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttggp\" (UID: \"56ad1b37-3737-409f-a332-2676129348b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.858306 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.879137 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.889080 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ebdd246-dcb5-4785-b1a4-7133f6317e91-trusted-ca\") pod \"ingress-operator-5b745b69d9-dbvth\" (UID: \"8ebdd246-dcb5-4785-b1a4-7133f6317e91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.889197 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/32005ae4-91c7-48c7-a713-59738b849926-webhook-cert\") pod \"packageserver-d55dfcdfc-jxp2s\" (UID: \"32005ae4-91c7-48c7-a713-59738b849926\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.889222 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ebdd246-dcb5-4785-b1a4-7133f6317e91-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dbvth\" (UID: \"8ebdd246-dcb5-4785-b1a4-7133f6317e91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.889290 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ebdd246-dcb5-4785-b1a4-7133f6317e91-metrics-tls\") pod \"ingress-operator-5b745b69d9-dbvth\" (UID: \"8ebdd246-dcb5-4785-b1a4-7133f6317e91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.889324 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/32005ae4-91c7-48c7-a713-59738b849926-apiservice-cert\") pod \"packageserver-d55dfcdfc-jxp2s\" (UID: \"32005ae4-91c7-48c7-a713-59738b849926\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.889351 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blr4k\" (UniqueName: \"kubernetes.io/projected/32005ae4-91c7-48c7-a713-59738b849926-kube-api-access-blr4k\") pod \"packageserver-d55dfcdfc-jxp2s\" (UID: \"32005ae4-91c7-48c7-a713-59738b849926\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.889365 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5wvp\" (UniqueName: \"kubernetes.io/projected/8ebdd246-dcb5-4785-b1a4-7133f6317e91-kube-api-access-r5wvp\") pod \"ingress-operator-5b745b69d9-dbvth\" (UID: \"8ebdd246-dcb5-4785-b1a4-7133f6317e91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.889390 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/32005ae4-91c7-48c7-a713-59738b849926-tmpfs\") pod \"packageserver-d55dfcdfc-jxp2s\" (UID: \"32005ae4-91c7-48c7-a713-59738b849926\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.889845 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/32005ae4-91c7-48c7-a713-59738b849926-tmpfs\") pod \"packageserver-d55dfcdfc-jxp2s\" (UID: \"32005ae4-91c7-48c7-a713-59738b849926\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.898944 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.908967 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bc1c2c4-ec18-46df-b876-157c08bbde36-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xjd4f\" (UID: \"1bc1c2c4-ec18-46df-b876-157c08bbde36\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.920571 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.930664 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc1c2c4-ec18-46df-b876-157c08bbde36-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xjd4f\" (UID: \"1bc1c2c4-ec18-46df-b876-157c08bbde36\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.938502 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.958243 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.978936 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 03 18:16:36 crc kubenswrapper[4835]: I1003 18:16:36.999658 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.012614 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7807d4de-42f2-489a-af4d-0317ebbc154c-serving-cert\") pod \"service-ca-operator-777779d784-pglch\" (UID: 
\"7807d4de-42f2-489a-af4d-0317ebbc154c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pglch" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.018866 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.039019 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.042366 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7807d4de-42f2-489a-af4d-0317ebbc154c-config\") pod \"service-ca-operator-777779d784-pglch\" (UID: \"7807d4de-42f2-489a-af4d-0317ebbc154c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pglch" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.058244 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.078775 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.099133 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.119632 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.139334 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.140453 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24baba3b-d1f1-426a-88a9-9bd5cb44112d-config\") pod \"kube-controller-manager-operator-78b949d7b-mc5r2\" (UID: \"24baba3b-d1f1-426a-88a9-9bd5cb44112d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.159735 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.167600 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24baba3b-d1f1-426a-88a9-9bd5cb44112d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mc5r2\" (UID: \"24baba3b-d1f1-426a-88a9-9bd5cb44112d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.178172 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.219008 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.238962 4835 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.258446 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.264107 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16d04d7d-2f2d-4bf7-a65f-0bda7bec3fb7-metrics-tls\") pod \"dns-operator-744455d44c-qgmg8\" (UID: \"16d04d7d-2f2d-4bf7-a65f-0bda7bec3fb7\") " pod="openshift-dns-operator/dns-operator-744455d44c-qgmg8" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.279515 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.298751 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.318909 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.338974 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.358728 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.378796 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.398623 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.418239 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.438945 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.458945 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.479798 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.492183 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ebdd246-dcb5-4785-b1a4-7133f6317e91-metrics-tls\") pod \"ingress-operator-5b745b69d9-dbvth\" (UID: \"8ebdd246-dcb5-4785-b1a4-7133f6317e91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.499166 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.519000 4835 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.539145 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.567097 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.571091 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ebdd246-dcb5-4785-b1a4-7133f6317e91-trusted-ca\") pod \"ingress-operator-5b745b69d9-dbvth\" (UID: \"8ebdd246-dcb5-4785-b1a4-7133f6317e91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.579841 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.599967 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.619630 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.639463 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.659028 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.678200 4835 request.go:700] Waited for 1.012808762s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-admission-controller-secret&limit=500&resourceVersion=0 Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.679783 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.699159 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.719774 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.738435 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.758830 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.779205 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.799264 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.819484 
4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.839732 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.858576 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.879172 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Oct 03 18:16:37 crc kubenswrapper[4835]: E1003 18:16:37.889773 4835 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Oct 03 18:16:37 crc kubenswrapper[4835]: E1003 18:16:37.889799 4835 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Oct 03 18:16:37 crc kubenswrapper[4835]: E1003 18:16:37.889836 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32005ae4-91c7-48c7-a713-59738b849926-apiservice-cert podName:32005ae4-91c7-48c7-a713-59738b849926 nodeName:}" failed. No retries permitted until 2025-10-03 18:16:38.38981839 +0000 UTC m=+140.105759262 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/32005ae4-91c7-48c7-a713-59738b849926-apiservice-cert") pod "packageserver-d55dfcdfc-jxp2s" (UID: "32005ae4-91c7-48c7-a713-59738b849926") : failed to sync secret cache: timed out waiting for the condition
Oct 03 18:16:37 crc kubenswrapper[4835]: E1003 18:16:37.889849 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32005ae4-91c7-48c7-a713-59738b849926-webhook-cert podName:32005ae4-91c7-48c7-a713-59738b849926 nodeName:}" failed. No retries permitted until 2025-10-03 18:16:38.3898432 +0000 UTC m=+140.105784072 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/32005ae4-91c7-48c7-a713-59738b849926-webhook-cert") pod "packageserver-d55dfcdfc-jxp2s" (UID: "32005ae4-91c7-48c7-a713-59738b849926") : failed to sync secret cache: timed out waiting for the condition
Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.898039 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.918421 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.938821 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.959446 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Oct 03 18:16:37 crc kubenswrapper[4835]: I1003 18:16:37.979248 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.003968 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.018852 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.038226 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.060210 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.079977 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.098979 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.119042 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.139217 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.158841 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.179394 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.199941 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.219377 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.238727 4835 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"kube-root-ca.crt" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.282603 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c9rb\" (UniqueName: \"kubernetes.io/projected/160f473b-3942-4848-affa-bb555f0068bc-kube-api-access-5c9rb\") pod \"console-operator-58897d9998-tz58z\" (UID: \"160f473b-3942-4848-affa-bb555f0068bc\") " pod="openshift-console-operator/console-operator-58897d9998-tz58z" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.301670 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-527px\" (UniqueName: \"kubernetes.io/projected/76f0bfed-fea0-4d28-bb03-2d3b0ae79d92-kube-api-access-527px\") pod \"machine-api-operator-5694c8668f-v7t7j\" (UID: \"76f0bfed-fea0-4d28-bb03-2d3b0ae79d92\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.313653 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.321816 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc8c9\" (UniqueName: \"kubernetes.io/projected/6b59d837-ca72-447d-8b77-42675b0ec49b-kube-api-access-tc8c9\") pod \"route-controller-manager-6576b87f9c-mr45m\" (UID: \"6b59d837-ca72-447d-8b77-42675b0ec49b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.329666 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-tz58z" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.339297 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.339785 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpzlv\" (UniqueName: \"kubernetes.io/projected/13a81cd4-6b06-4a2b-a2c2-4a0778518313-kube-api-access-mpzlv\") pod \"cluster-samples-operator-665b6dd947-bkfmz\" (UID: \"13a81cd4-6b06-4a2b-a2c2-4a0778518313\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkfmz" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.352515 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkfmz" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.359300 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.381581 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.399153 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.408154 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/32005ae4-91c7-48c7-a713-59738b849926-webhook-cert\") pod \"packageserver-d55dfcdfc-jxp2s\" (UID: \"32005ae4-91c7-48c7-a713-59738b849926\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.408256 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/32005ae4-91c7-48c7-a713-59738b849926-apiservice-cert\") pod \"packageserver-d55dfcdfc-jxp2s\" (UID: \"32005ae4-91c7-48c7-a713-59738b849926\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.411810 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/32005ae4-91c7-48c7-a713-59738b849926-webhook-cert\") pod \"packageserver-d55dfcdfc-jxp2s\" (UID: \"32005ae4-91c7-48c7-a713-59738b849926\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.412139 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/32005ae4-91c7-48c7-a713-59738b849926-apiservice-cert\") pod \"packageserver-d55dfcdfc-jxp2s\" (UID: \"32005ae4-91c7-48c7-a713-59738b849926\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.439591 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.459354 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.479438 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.506625 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.519369 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.539471 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.556600 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkfmz"] Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.559353 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.578763 4835 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.598451 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.599834 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.636143 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqsdt\" (UniqueName: \"kubernetes.io/projected/1bc1c2c4-ec18-46df-b876-157c08bbde36-kube-api-access-zqsdt\") pod \"kube-storage-version-migrator-operator-b67b599dd-xjd4f\" (UID: \"1bc1c2c4-ec18-46df-b876-157c08bbde36\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.653952 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scxdk\" (UniqueName: \"kubernetes.io/projected/2cc40a88-90de-40be-b285-4b7f8bd11709-kube-api-access-scxdk\") pod \"etcd-operator-b45778765-gnrjg\" (UID: \"2cc40a88-90de-40be-b285-4b7f8bd11709\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.677668 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5vnb\" (UniqueName: \"kubernetes.io/projected/f3ea7745-8aa0-4bcd-86e4-326313d026bd-kube-api-access-m5vnb\") pod \"apiserver-76f77b778f-nh4lt\" (UID: \"f3ea7745-8aa0-4bcd-86e4-326313d026bd\") " pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.692680 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8jkr\" (UniqueName: \"kubernetes.io/projected/56bc4f0f-9acd-4179-be10-9ad383cbf689-kube-api-access-k8jkr\") pod \"openshift-config-operator-7777fb866f-q6jkk\" (UID: \"56bc4f0f-9acd-4179-be10-9ad383cbf689\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.699755 4835 request.go:700] Waited for 1.914715855s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/serviceaccounts/openshift-controller-manager-operator/token Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.729289 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4p8g\" (UniqueName: \"kubernetes.io/projected/96ad30f4-8507-4530-af53-06c628b1388e-kube-api-access-d4p8g\") pod \"openshift-controller-manager-operator-756b6f6bc6-lbkjb\" (UID: \"96ad30f4-8507-4530-af53-06c628b1388e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.739349 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-58897d9998-tz58z"] Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.740289 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-v7t7j"] Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.744719 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqhvg\" (UniqueName: \"kubernetes.io/projected/3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8-kube-api-access-hqhvg\") pod \"openshift-apiserver-operator-796bbdcf4f-dvkvj\" (UID: \"3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.756451 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqvtp\" (UniqueName: \"kubernetes.io/projected/16d04d7d-2f2d-4bf7-a65f-0bda7bec3fb7-kube-api-access-lqvtp\") pod \"dns-operator-744455d44c-qgmg8\" (UID: \"16d04d7d-2f2d-4bf7-a65f-0bda7bec3fb7\") " pod="openshift-dns-operator/dns-operator-744455d44c-qgmg8" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.756724 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.781457 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56ad1b37-3737-409f-a332-2676129348b6-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ttggp\" (UID: \"56ad1b37-3737-409f-a332-2676129348b6\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.793964 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjsxz\" (UniqueName: \"kubernetes.io/projected/6dbbcec6-d076-48dc-8d4d-93668ce8f2a1-kube-api-access-fjsxz\") pod \"apiserver-7bbb656c7d-n8dm7\" (UID: \"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.809407 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m"] Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.818521 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whqgr\" (UniqueName: \"kubernetes.io/projected/300d2397-b9b1-4f44-9eb2-5757940cc64c-kube-api-access-whqgr\") pod \"console-f9d7485db-8nwfg\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:38 crc kubenswrapper[4835]: W1003 18:16:38.829346 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b59d837_ca72_447d_8b77_42675b0ec49b.slice/crio-7577de618cd02bd4bf6639a18bd116fe42cab8d9dbac69609e75252193e0268e WatchSource:0}: Error finding container 7577de618cd02bd4bf6639a18bd116fe42cab8d9dbac69609e75252193e0268e: Status 404 returned error can't find the container with id 7577de618cd02bd4bf6639a18bd116fe42cab8d9dbac69609e75252193e0268e Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.836124 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-lwndq\" (UID: \"6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.850858 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.856537 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.859310 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.862441 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fxnq\" (UniqueName: \"kubernetes.io/projected/6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47-kube-api-access-9fxnq\") pod \"cluster-image-registry-operator-dc59b4c8b-lwndq\" (UID: \"6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.868744 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.896376 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9nwp\" (UniqueName: \"kubernetes.io/projected/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-kube-api-access-r9nwp\") pod \"controller-manager-879f6c89f-4tmrr\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.899308 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr2n7\" (UniqueName: \"kubernetes.io/projected/810a8fd3-d63d-4fd1-b6f1-186457e8878a-kube-api-access-zr2n7\") pod \"oauth-openshift-558db77b4-p7wfs\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.912581 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp8xm\" (UniqueName: \"kubernetes.io/projected/7807d4de-42f2-489a-af4d-0317ebbc154c-kube-api-access-sp8xm\") pod \"service-ca-operator-777779d784-pglch\" (UID: \"7807d4de-42f2-489a-af4d-0317ebbc154c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-pglch" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.936813 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qgmg8" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.945453 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phjd9\" (UniqueName: \"kubernetes.io/projected/f0eca3a7-03cd-4126-83d7-b15db1a7232f-kube-api-access-phjd9\") pod \"authentication-operator-69f744f599-v7gmf\" (UID: \"f0eca3a7-03cd-4126-83d7-b15db1a7232f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.961843 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24baba3b-d1f1-426a-88a9-9bd5cb44112d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mc5r2\" (UID: \"24baba3b-d1f1-426a-88a9-9bd5cb44112d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.965154 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.972176 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.977200 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-244rv\" (UniqueName: \"kubernetes.io/projected/476e65c8-6293-4edf-b6a6-197763d8f7e1-kube-api-access-244rv\") pod \"machine-approver-56656f9798-95g8t\" (UID: \"476e65c8-6293-4edf-b6a6-197763d8f7e1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.983446 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nh4lt"] Oct 03 18:16:38 crc kubenswrapper[4835]: I1003 18:16:38.987857 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.001352 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvjt4\" (UniqueName: \"kubernetes.io/projected/b2b48efb-bafd-45af-b56e-4568f2416af8-kube-api-access-mvjt4\") pod \"migrator-59844c95c7-prk5w\" (UID: \"b2b48efb-bafd-45af-b56e-4568f2416af8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prk5w" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.022694 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6pq5\" (UniqueName: \"kubernetes.io/projected/efa22703-0a66-407f-9f2e-333bac190ce8-kube-api-access-k6pq5\") pod \"downloads-7954f5f757-mcjr4\" (UID: \"efa22703-0a66-407f-9f2e-333bac190ce8\") " pod="openshift-console/downloads-7954f5f757-mcjr4" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.036694 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ebdd246-dcb5-4785-b1a4-7133f6317e91-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dbvth\" (UID: \"8ebdd246-dcb5-4785-b1a4-7133f6317e91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.043052 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.053622 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blr4k\" (UniqueName: \"kubernetes.io/projected/32005ae4-91c7-48c7-a713-59738b849926-kube-api-access-blr4k\") pod \"packageserver-d55dfcdfc-jxp2s\" (UID: \"32005ae4-91c7-48c7-a713-59738b849926\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.066936 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mcjr4" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.073920 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.077135 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5wvp\" (UniqueName: \"kubernetes.io/projected/8ebdd246-dcb5-4785-b1a4-7133f6317e91-kube-api-access-r5wvp\") pod \"ingress-operator-5b745b69d9-dbvth\" (UID: \"8ebdd246-dcb5-4785-b1a4-7133f6317e91\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.081570 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.115866 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.124090 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.125926 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/28dda709-c837-4e67-90be-7acda1dd093a-signing-key\") pod \"service-ca-9c57cc56f-677dr\" (UID: \"28dda709-c837-4e67-90be-7acda1dd093a\") " pod="openshift-service-ca/service-ca-9c57cc56f-677dr" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.125951 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0545b5e8-2fe4-474f-aa8b-a00964fc6237-srv-cert\") pod \"catalog-operator-68c6474976-b5c8p\" (UID: \"0545b5e8-2fe4-474f-aa8b-a00964fc6237\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.125972 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-registry-certificates\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.125989 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v88q9\" (UniqueName: \"kubernetes.io/projected/59865c39-3a5f-44cf-b6a8-ce0552bc8d0b-kube-api-access-v88q9\") pod \"olm-operator-6b444d44fb-h6c75\" (UID: \"59865c39-3a5f-44cf-b6a8-ce0552bc8d0b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126006 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6818d850-0c23-481b-b3f5-fbb31275d97f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pf9vb\" (UID: \"6818d850-0c23-481b-b3f5-fbb31275d97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126021 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/81aa4a3b-1c57-4883-b423-dc237393b801-images\") pod \"machine-config-operator-74547568cd-vg2h6\" (UID: \"81aa4a3b-1c57-4883-b423-dc237393b801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126046 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85-default-certificate\") pod \"router-default-5444994796-4fnq6\" (UID: \"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85\") " pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126121 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdlgj\" (UniqueName: \"kubernetes.io/projected/321f8504-187a-46a9-b5bc-a27b93175e39-kube-api-access-hdlgj\") pod \"package-server-manager-789f6589d5-b7btc\" (UID: \"321f8504-187a-46a9-b5bc-a27b93175e39\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126138 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-registry-tls\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126154 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85-metrics-certs\") pod \"router-default-5444994796-4fnq6\" (UID: \"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85\") " pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126172 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-config-volume\") pod \"collect-profiles-29325255-8n5s2\" (UID: \"8cb17f76-674e-4cf7-8f87-9af6942bc5c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126200 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkkh7\" (UniqueName: \"kubernetes.io/projected/078d8acd-04ed-453e-a67b-39efe6ea5bf9-kube-api-access-rkkh7\") pod \"multus-admission-controller-857f4d67dd-fqtzs\" (UID: \"078d8acd-04ed-453e-a67b-39efe6ea5bf9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fqtzs" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126215 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81aa4a3b-1c57-4883-b423-dc237393b801-proxy-tls\") pod \"machine-config-operator-74547568cd-vg2h6\" (UID: \"81aa4a3b-1c57-4883-b423-dc237393b801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126241 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltvm9\" (UniqueName: \"kubernetes.io/projected/0545b5e8-2fe4-474f-aa8b-a00964fc6237-kube-api-access-ltvm9\") pod \"catalog-operator-68c6474976-b5c8p\" (UID: \"0545b5e8-2fe4-474f-aa8b-a00964fc6237\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126268 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6-config\") pod \"kube-apiserver-operator-766d6c64bb-l27kl\" (UID: \"9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126285 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64724db7-de23-427c-a555-585e9c0b7173-cert\") pod \"ingress-canary-lf89q\" (UID: \"64724db7-de23-427c-a555-585e9c0b7173\") " pod="openshift-ingress-canary/ingress-canary-lf89q" Oct 
03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126324 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa5e9c31-c582-444e-97c2-9e285e2b75d4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5rj92\" (UID: \"aa5e9c31-c582-444e-97c2-9e285e2b75d4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5rj92" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126343 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxkhh\" (UniqueName: \"kubernetes.io/projected/a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85-kube-api-access-kxkhh\") pod \"router-default-5444994796-4fnq6\" (UID: \"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85\") " pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126360 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnsm4\" (UniqueName: \"kubernetes.io/projected/64724db7-de23-427c-a555-585e9c0b7173-kube-api-access-vnsm4\") pod \"ingress-canary-lf89q\" (UID: \"64724db7-de23-427c-a555-585e9c0b7173\") " pod="openshift-ingress-canary/ingress-canary-lf89q" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126393 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-l27kl\" (UID: \"9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126433 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126476 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126491 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-secret-volume\") pod \"collect-profiles-29325255-8n5s2\" (UID: \"8cb17f76-674e-4cf7-8f87-9af6942bc5c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126519 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/078d8acd-04ed-453e-a67b-39efe6ea5bf9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fqtzs\" (UID: \"078d8acd-04ed-453e-a67b-39efe6ea5bf9\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-fqtzs" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126561 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-trusted-ca\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126578 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7pft\" (UniqueName: \"kubernetes.io/projected/28dda709-c837-4e67-90be-7acda1dd093a-kube-api-access-d7pft\") pod \"service-ca-9c57cc56f-677dr\" (UID: \"28dda709-c837-4e67-90be-7acda1dd093a\") " pod="openshift-service-ca/service-ca-9c57cc56f-677dr" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126594 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6818d850-0c23-481b-b3f5-fbb31275d97f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pf9vb\" (UID: \"6818d850-0c23-481b-b3f5-fbb31275d97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126610 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd7dm\" (UniqueName: \"kubernetes.io/projected/9da20332-c5f6-49ae-af6e-cabb1e166cf5-kube-api-access-bd7dm\") pod \"machine-config-controller-84d6567774-z6xvv\" (UID: \"9da20332-c5f6-49ae-af6e-cabb1e166cf5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126659 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/59865c39-3a5f-44cf-b6a8-ce0552bc8d0b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h6c75\" (UID: \"59865c39-3a5f-44cf-b6a8-ce0552bc8d0b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126675 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9da20332-c5f6-49ae-af6e-cabb1e166cf5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-z6xvv\" (UID: \"9da20332-c5f6-49ae-af6e-cabb1e166cf5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126714 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126762 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/59865c39-3a5f-44cf-b6a8-ce0552bc8d0b-srv-cert\") pod \"olm-operator-6b444d44fb-h6c75\" (UID: 
\"59865c39-3a5f-44cf-b6a8-ce0552bc8d0b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126782 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bfn2\" (UniqueName: \"kubernetes.io/projected/6818d850-0c23-481b-b3f5-fbb31275d97f-kube-api-access-2bfn2\") pod \"marketplace-operator-79b997595-pf9vb\" (UID: \"6818d850-0c23-481b-b3f5-fbb31275d97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126805 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0545b5e8-2fe4-474f-aa8b-a00964fc6237-profile-collector-cert\") pod \"catalog-operator-68c6474976-b5c8p\" (UID: \"0545b5e8-2fe4-474f-aa8b-a00964fc6237\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126826 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcd9x\" (UniqueName: \"kubernetes.io/projected/81aa4a3b-1c57-4883-b423-dc237393b801-kube-api-access-rcd9x\") pod \"machine-config-operator-74547568cd-vg2h6\" (UID: \"81aa4a3b-1c57-4883-b423-dc237393b801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126879 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/28dda709-c837-4e67-90be-7acda1dd093a-signing-cabundle\") pod \"service-ca-9c57cc56f-677dr\" (UID: \"28dda709-c837-4e67-90be-7acda1dd093a\") " pod="openshift-service-ca/service-ca-9c57cc56f-677dr" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.126914 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-bound-sa-token\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.127040 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ngjr\" (UniqueName: \"kubernetes.io/projected/aa5e9c31-c582-444e-97c2-9e285e2b75d4-kube-api-access-6ngjr\") pod \"control-plane-machine-set-operator-78cbb6b69f-5rj92\" (UID: \"aa5e9c31-c582-444e-97c2-9e285e2b75d4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5rj92" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.127160 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kttxv\" (UniqueName: \"kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-kube-api-access-kttxv\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.127190 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81aa4a3b-1c57-4883-b423-dc237393b801-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-vg2h6\" (UID: \"81aa4a3b-1c57-4883-b423-dc237393b801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.127217 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9da20332-c5f6-49ae-af6e-cabb1e166cf5-proxy-tls\") pod \"machine-config-controller-84d6567774-z6xvv\" (UID: \"9da20332-c5f6-49ae-af6e-cabb1e166cf5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.127241 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85-service-ca-bundle\") pod \"router-default-5444994796-4fnq6\" (UID: \"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85\") " pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.127261 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/321f8504-187a-46a9-b5bc-a27b93175e39-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b7btc\" (UID: \"321f8504-187a-46a9-b5bc-a27b93175e39\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.127350 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-l27kl\" (UID: \"9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.127476 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85-stats-auth\") pod \"router-default-5444994796-4fnq6\" (UID: \"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85\") " pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.127499 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98b9r\" (UniqueName: \"kubernetes.io/projected/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-kube-api-access-98b9r\") pod \"collect-profiles-29325255-8n5s2\" (UID: \"8cb17f76-674e-4cf7-8f87-9af6942bc5c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" Oct 03 18:16:39 crc kubenswrapper[4835]: E1003 18:16:39.131284 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:39.631264074 +0000 UTC m=+141.347204946 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.134866 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.154374 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp"] Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.162261 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk"] Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.174934 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pglch" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.180811 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prk5w" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.187394 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2" Oct 03 18:16:39 crc kubenswrapper[4835]: W1003 18:16:39.190211 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod476e65c8_6293_4edf_b6a6_197763d8f7e1.slice/crio-e6e94e0d2983cfe01359870f219ccd95d29fa50a145a036365b50763976b7f19 WatchSource:0}: Error finding container e6e94e0d2983cfe01359870f219ccd95d29fa50a145a036365b50763976b7f19: Status 404 returned error can't find the container with id e6e94e0d2983cfe01359870f219ccd95d29fa50a145a036365b50763976b7f19 Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.228503 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:39 crc kubenswrapper[4835]: E1003 18:16:39.228646 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:39.728623977 +0000 UTC m=+141.444564849 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.228718 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81aa4a3b-1c57-4883-b423-dc237393b801-proxy-tls\") pod \"machine-config-operator-74547568cd-vg2h6\" (UID: \"81aa4a3b-1c57-4883-b423-dc237393b801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.228749 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltvm9\" (UniqueName: \"kubernetes.io/projected/0545b5e8-2fe4-474f-aa8b-a00964fc6237-kube-api-access-ltvm9\") pod \"catalog-operator-68c6474976-b5c8p\" (UID: \"0545b5e8-2fe4-474f-aa8b-a00964fc6237\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.228802 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6-config\") pod \"kube-apiserver-operator-766d6c64bb-l27kl\" (UID: \"9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.228837 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64724db7-de23-427c-a555-585e9c0b7173-cert\") pod \"ingress-canary-lf89q\" (UID: \"64724db7-de23-427c-a555-585e9c0b7173\") " pod="openshift-ingress-canary/ingress-canary-lf89q" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.228859 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxkhh\" (UniqueName: \"kubernetes.io/projected/a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85-kube-api-access-kxkhh\") pod \"router-default-5444994796-4fnq6\" (UID: \"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85\") " pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.228878 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnsm4\" (UniqueName: \"kubernetes.io/projected/64724db7-de23-427c-a555-585e9c0b7173-kube-api-access-vnsm4\") pod \"ingress-canary-lf89q\" (UID: \"64724db7-de23-427c-a555-585e9c0b7173\") " pod="openshift-ingress-canary/ingress-canary-lf89q" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.229616 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa5e9c31-c582-444e-97c2-9e285e2b75d4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5rj92\" (UID: \"aa5e9c31-c582-444e-97c2-9e285e2b75d4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5rj92" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.229673 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-l27kl\" (UID: \"9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.229712 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmwg2\" (UniqueName: \"kubernetes.io/projected/ba033d0b-fb5e-4f72-8ac4-31ade9c07a25-kube-api-access-bmwg2\") pod \"dns-default-26hp9\" (UID: \"ba033d0b-fb5e-4f72-8ac4-31ade9c07a25\") " pod="openshift-dns/dns-default-26hp9" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.229784 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/084d1738-c9cb-4d72-98e1-dbbd06e4b084-plugins-dir\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.229824 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.229873 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.229895 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-secret-volume\") pod \"collect-profiles-29325255-8n5s2\" (UID: \"8cb17f76-674e-4cf7-8f87-9af6942bc5c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.229932 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/078d8acd-04ed-453e-a67b-39efe6ea5bf9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fqtzs\" (UID: \"078d8acd-04ed-453e-a67b-39efe6ea5bf9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fqtzs" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.229954 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/084d1738-c9cb-4d72-98e1-dbbd06e4b084-registration-dir\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.229979 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba033d0b-fb5e-4f72-8ac4-31ade9c07a25-config-volume\") pod \"dns-default-26hp9\" (UID: \"ba033d0b-fb5e-4f72-8ac4-31ade9c07a25\") " 
pod="openshift-dns/dns-default-26hp9" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.232678 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6-config\") pod \"kube-apiserver-operator-766d6c64bb-l27kl\" (UID: \"9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.233367 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.236149 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-trusted-ca\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.236190 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7pft\" (UniqueName: \"kubernetes.io/projected/28dda709-c837-4e67-90be-7acda1dd093a-kube-api-access-d7pft\") pod \"service-ca-9c57cc56f-677dr\" (UID: \"28dda709-c837-4e67-90be-7acda1dd093a\") " pod="openshift-service-ca/service-ca-9c57cc56f-677dr" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.236690 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6818d850-0c23-481b-b3f5-fbb31275d97f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pf9vb\" (UID: \"6818d850-0c23-481b-b3f5-fbb31275d97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.237447 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa5e9c31-c582-444e-97c2-9e285e2b75d4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5rj92\" (UID: \"aa5e9c31-c582-444e-97c2-9e285e2b75d4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5rj92" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.237752 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd7dm\" (UniqueName: \"kubernetes.io/projected/9da20332-c5f6-49ae-af6e-cabb1e166cf5-kube-api-access-bd7dm\") pod \"machine-config-controller-84d6567774-z6xvv\" (UID: \"9da20332-c5f6-49ae-af6e-cabb1e166cf5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.237789 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/084d1738-c9cb-4d72-98e1-dbbd06e4b084-mountpoint-dir\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.246315 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-trusted-ca\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.247847 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-l27kl\" (UID: \"9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.247969 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-secret-volume\") pod \"collect-profiles-29325255-8n5s2\" (UID: \"8cb17f76-674e-4cf7-8f87-9af6942bc5c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.248283 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6h95\" (UniqueName: \"kubernetes.io/projected/789193fd-4e59-4291-9dd3-57edc8bfd700-kube-api-access-l6h95\") pod \"machine-config-server-lt54v\" (UID: \"789193fd-4e59-4291-9dd3-57edc8bfd700\") " pod="openshift-machine-config-operator/machine-config-server-lt54v" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.248382 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpgsh\" (UniqueName: \"kubernetes.io/projected/084d1738-c9cb-4d72-98e1-dbbd06e4b084-kube-api-access-rpgsh\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.248484 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81aa4a3b-1c57-4883-b423-dc237393b801-proxy-tls\") pod \"machine-config-operator-74547568cd-vg2h6\" (UID: \"81aa4a3b-1c57-4883-b423-dc237393b801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.249255 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6818d850-0c23-481b-b3f5-fbb31275d97f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pf9vb\" (UID: \"6818d850-0c23-481b-b3f5-fbb31275d97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.249884 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.250345 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/59865c39-3a5f-44cf-b6a8-ce0552bc8d0b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h6c75\" (UID: \"59865c39-3a5f-44cf-b6a8-ce0552bc8d0b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.256217 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9da20332-c5f6-49ae-af6e-cabb1e166cf5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-z6xvv\" (UID: \"9da20332-c5f6-49ae-af6e-cabb1e166cf5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.256396 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.257879 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9da20332-c5f6-49ae-af6e-cabb1e166cf5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-z6xvv\" (UID: \"9da20332-c5f6-49ae-af6e-cabb1e166cf5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.257926 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/59865c39-3a5f-44cf-b6a8-ce0552bc8d0b-srv-cert\") pod \"olm-operator-6b444d44fb-h6c75\" (UID: \"59865c39-3a5f-44cf-b6a8-ce0552bc8d0b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.258029 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bfn2\" (UniqueName: \"kubernetes.io/projected/6818d850-0c23-481b-b3f5-fbb31275d97f-kube-api-access-2bfn2\") pod \"marketplace-operator-79b997595-pf9vb\" (UID: \"6818d850-0c23-481b-b3f5-fbb31275d97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.258094 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0545b5e8-2fe4-474f-aa8b-a00964fc6237-profile-collector-cert\") pod \"catalog-operator-68c6474976-b5c8p\" (UID: \"0545b5e8-2fe4-474f-aa8b-a00964fc6237\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.258122 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcd9x\" (UniqueName: \"kubernetes.io/projected/81aa4a3b-1c57-4883-b423-dc237393b801-kube-api-access-rcd9x\") pod \"machine-config-operator-74547568cd-vg2h6\" (UID: \"81aa4a3b-1c57-4883-b423-dc237393b801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.262214 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/28dda709-c837-4e67-90be-7acda1dd093a-signing-cabundle\") pod \"service-ca-9c57cc56f-677dr\" (UID: \"28dda709-c837-4e67-90be-7acda1dd093a\") " pod="openshift-service-ca/service-ca-9c57cc56f-677dr" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.262452 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64724db7-de23-427c-a555-585e9c0b7173-cert\") pod \"ingress-canary-lf89q\" (UID: \"64724db7-de23-427c-a555-585e9c0b7173\") " pod="openshift-ingress-canary/ingress-canary-lf89q" Oct 03 18:16:39 crc kubenswrapper[4835]: E1003 18:16:39.262501 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:39.762484452 +0000 UTC m=+141.478425324 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.262738 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/59865c39-3a5f-44cf-b6a8-ce0552bc8d0b-srv-cert\") pod \"olm-operator-6b444d44fb-h6c75\" (UID: \"59865c39-3a5f-44cf-b6a8-ce0552bc8d0b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.263018 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.263536 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/28dda709-c837-4e67-90be-7acda1dd093a-signing-cabundle\") pod \"service-ca-9c57cc56f-677dr\" (UID: \"28dda709-c837-4e67-90be-7acda1dd093a\") " pod="openshift-service-ca/service-ca-9c57cc56f-677dr" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.263710 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-bound-sa-token\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.263781 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ngjr\" (UniqueName: \"kubernetes.io/projected/aa5e9c31-c582-444e-97c2-9e285e2b75d4-kube-api-access-6ngjr\") pod \"control-plane-machine-set-operator-78cbb6b69f-5rj92\" (UID: \"aa5e9c31-c582-444e-97c2-9e285e2b75d4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5rj92" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.263814 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kttxv\" (UniqueName: \"kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-kube-api-access-kttxv\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.263840 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81aa4a3b-1c57-4883-b423-dc237393b801-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vg2h6\" (UID: \"81aa4a3b-1c57-4883-b423-dc237393b801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.263871 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9da20332-c5f6-49ae-af6e-cabb1e166cf5-proxy-tls\") pod \"machine-config-controller-84d6567774-z6xvv\" (UID: \"9da20332-c5f6-49ae-af6e-cabb1e166cf5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.263896 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85-service-ca-bundle\") pod \"router-default-5444994796-4fnq6\" (UID: \"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85\") " pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.263957 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/321f8504-187a-46a9-b5bc-a27b93175e39-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b7btc\" (UID: \"321f8504-187a-46a9-b5bc-a27b93175e39\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc" Oct 03 18:16:39 
crc kubenswrapper[4835]: I1003 18:16:39.263992 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba033d0b-fb5e-4f72-8ac4-31ade9c07a25-metrics-tls\") pod \"dns-default-26hp9\" (UID: \"ba033d0b-fb5e-4f72-8ac4-31ade9c07a25\") " pod="openshift-dns/dns-default-26hp9" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264020 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/789193fd-4e59-4291-9dd3-57edc8bfd700-certs\") pod \"machine-config-server-lt54v\" (UID: \"789193fd-4e59-4291-9dd3-57edc8bfd700\") " pod="openshift-machine-config-operator/machine-config-server-lt54v" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264047 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-l27kl\" (UID: \"9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264100 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/789193fd-4e59-4291-9dd3-57edc8bfd700-node-bootstrap-token\") pod \"machine-config-server-lt54v\" (UID: \"789193fd-4e59-4291-9dd3-57edc8bfd700\") " pod="openshift-machine-config-operator/machine-config-server-lt54v" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264191 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85-stats-auth\") pod \"router-default-5444994796-4fnq6\" (UID: \"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85\") " pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264231 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98b9r\" (UniqueName: \"kubernetes.io/projected/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-kube-api-access-98b9r\") pod \"collect-profiles-29325255-8n5s2\" (UID: \"8cb17f76-674e-4cf7-8f87-9af6942bc5c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264282 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/28dda709-c837-4e67-90be-7acda1dd093a-signing-key\") pod \"service-ca-9c57cc56f-677dr\" (UID: \"28dda709-c837-4e67-90be-7acda1dd093a\") " pod="openshift-service-ca/service-ca-9c57cc56f-677dr" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264307 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/084d1738-c9cb-4d72-98e1-dbbd06e4b084-socket-dir\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264349 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0545b5e8-2fe4-474f-aa8b-a00964fc6237-srv-cert\") pod \"catalog-operator-68c6474976-b5c8p\" 
(UID: \"0545b5e8-2fe4-474f-aa8b-a00964fc6237\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264372 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v88q9\" (UniqueName: \"kubernetes.io/projected/59865c39-3a5f-44cf-b6a8-ce0552bc8d0b-kube-api-access-v88q9\") pod \"olm-operator-6b444d44fb-h6c75\" (UID: \"59865c39-3a5f-44cf-b6a8-ce0552bc8d0b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264394 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-registry-certificates\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264456 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85-default-certificate\") pod \"router-default-5444994796-4fnq6\" (UID: \"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85\") " pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264480 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6818d850-0c23-481b-b3f5-fbb31275d97f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pf9vb\" (UID: \"6818d850-0c23-481b-b3f5-fbb31275d97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264499 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/81aa4a3b-1c57-4883-b423-dc237393b801-images\") pod \"machine-config-operator-74547568cd-vg2h6\" (UID: \"81aa4a3b-1c57-4883-b423-dc237393b801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264531 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdlgj\" (UniqueName: \"kubernetes.io/projected/321f8504-187a-46a9-b5bc-a27b93175e39-kube-api-access-hdlgj\") pod \"package-server-manager-789f6589d5-b7btc\" (UID: \"321f8504-187a-46a9-b5bc-a27b93175e39\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264556 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-registry-tls\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264582 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85-metrics-certs\") pod \"router-default-5444994796-4fnq6\" (UID: \"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85\") " pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 
18:16:39.264628 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-config-volume\") pod \"collect-profiles-29325255-8n5s2\" (UID: \"8cb17f76-674e-4cf7-8f87-9af6942bc5c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264673 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkkh7\" (UniqueName: \"kubernetes.io/projected/078d8acd-04ed-453e-a67b-39efe6ea5bf9-kube-api-access-rkkh7\") pod \"multus-admission-controller-857f4d67dd-fqtzs\" (UID: \"078d8acd-04ed-453e-a67b-39efe6ea5bf9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fqtzs" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.264705 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/084d1738-c9cb-4d72-98e1-dbbd06e4b084-csi-data-dir\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.267267 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0545b5e8-2fe4-474f-aa8b-a00964fc6237-profile-collector-cert\") pod \"catalog-operator-68c6474976-b5c8p\" (UID: \"0545b5e8-2fe4-474f-aa8b-a00964fc6237\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.271291 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/078d8acd-04ed-453e-a67b-39efe6ea5bf9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fqtzs\" (UID: \"078d8acd-04ed-453e-a67b-39efe6ea5bf9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fqtzs" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.272710 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/321f8504-187a-46a9-b5bc-a27b93175e39-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b7btc\" (UID: \"321f8504-187a-46a9-b5bc-a27b93175e39\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.283307 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-registry-certificates\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.283964 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-config-volume\") pod \"collect-profiles-29325255-8n5s2\" (UID: \"8cb17f76-674e-4cf7-8f87-9af6942bc5c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.286798 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/81aa4a3b-1c57-4883-b423-dc237393b801-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vg2h6\" (UID: \"81aa4a3b-1c57-4883-b423-dc237393b801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.287491 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/81aa4a3b-1c57-4883-b423-dc237393b801-images\") pod \"machine-config-operator-74547568cd-vg2h6\" (UID: \"81aa4a3b-1c57-4883-b423-dc237393b801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.288909 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85-service-ca-bundle\") pod \"router-default-5444994796-4fnq6\" (UID: \"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85\") " pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.289573 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85-stats-auth\") pod \"router-default-5444994796-4fnq6\" (UID: \"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85\") " pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.289789 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltvm9\" (UniqueName: \"kubernetes.io/projected/0545b5e8-2fe4-474f-aa8b-a00964fc6237-kube-api-access-ltvm9\") pod \"catalog-operator-68c6474976-b5c8p\" (UID: \"0545b5e8-2fe4-474f-aa8b-a00964fc6237\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.289989 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6818d850-0c23-481b-b3f5-fbb31275d97f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pf9vb\" (UID: \"6818d850-0c23-481b-b3f5-fbb31275d97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.290014 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9da20332-c5f6-49ae-af6e-cabb1e166cf5-proxy-tls\") pod \"machine-config-controller-84d6567774-z6xvv\" (UID: \"9da20332-c5f6-49ae-af6e-cabb1e166cf5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.290298 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85-default-certificate\") pod \"router-default-5444994796-4fnq6\" (UID: \"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85\") " pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.290543 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-registry-tls\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 
crc kubenswrapper[4835]: I1003 18:16:39.290575 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/28dda709-c837-4e67-90be-7acda1dd093a-signing-key\") pod \"service-ca-9c57cc56f-677dr\" (UID: \"28dda709-c837-4e67-90be-7acda1dd093a\") " pod="openshift-service-ca/service-ca-9c57cc56f-677dr" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.290833 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0545b5e8-2fe4-474f-aa8b-a00964fc6237-srv-cert\") pod \"catalog-operator-68c6474976-b5c8p\" (UID: \"0545b5e8-2fe4-474f-aa8b-a00964fc6237\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.292651 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/59865c39-3a5f-44cf-b6a8-ce0552bc8d0b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-h6c75\" (UID: \"59865c39-3a5f-44cf-b6a8-ce0552bc8d0b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.296357 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnsm4\" (UniqueName: \"kubernetes.io/projected/64724db7-de23-427c-a555-585e9c0b7173-kube-api-access-vnsm4\") pod \"ingress-canary-lf89q\" (UID: \"64724db7-de23-427c-a555-585e9c0b7173\") " pod="openshift-ingress-canary/ingress-canary-lf89q" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.296826 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85-metrics-certs\") pod \"router-default-5444994796-4fnq6\" (UID: \"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85\") " pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.319430 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7pft\" (UniqueName: \"kubernetes.io/projected/28dda709-c837-4e67-90be-7acda1dd093a-kube-api-access-d7pft\") pod \"service-ca-9c57cc56f-677dr\" (UID: \"28dda709-c837-4e67-90be-7acda1dd093a\") " pod="openshift-service-ca/service-ca-9c57cc56f-677dr" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.338748 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.349355 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.350800 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd7dm\" (UniqueName: \"kubernetes.io/projected/9da20332-c5f6-49ae-af6e-cabb1e166cf5-kube-api-access-bd7dm\") pod \"machine-config-controller-84d6567774-z6xvv\" (UID: \"9da20332-c5f6-49ae-af6e-cabb1e166cf5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.354782 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-677dr" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.362848 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxkhh\" (UniqueName: \"kubernetes.io/projected/a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85-kube-api-access-kxkhh\") pod \"router-default-5444994796-4fnq6\" (UID: \"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85\") " pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.363137 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lf89q" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.366823 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.367484 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/084d1738-c9cb-4d72-98e1-dbbd06e4b084-registration-dir\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.367512 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba033d0b-fb5e-4f72-8ac4-31ade9c07a25-config-volume\") pod \"dns-default-26hp9\" (UID: \"ba033d0b-fb5e-4f72-8ac4-31ade9c07a25\") " pod="openshift-dns/dns-default-26hp9" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.367542 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/084d1738-c9cb-4d72-98e1-dbbd06e4b084-mountpoint-dir\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.367577 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6h95\" (UniqueName: \"kubernetes.io/projected/789193fd-4e59-4291-9dd3-57edc8bfd700-kube-api-access-l6h95\") pod \"machine-config-server-lt54v\" (UID: \"789193fd-4e59-4291-9dd3-57edc8bfd700\") " pod="openshift-machine-config-operator/machine-config-server-lt54v" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.367602 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpgsh\" (UniqueName: \"kubernetes.io/projected/084d1738-c9cb-4d72-98e1-dbbd06e4b084-kube-api-access-rpgsh\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.367701 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba033d0b-fb5e-4f72-8ac4-31ade9c07a25-metrics-tls\") pod \"dns-default-26hp9\" (UID: \"ba033d0b-fb5e-4f72-8ac4-31ade9c07a25\") " pod="openshift-dns/dns-default-26hp9" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.367720 4835 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/789193fd-4e59-4291-9dd3-57edc8bfd700-certs\") pod \"machine-config-server-lt54v\" (UID: \"789193fd-4e59-4291-9dd3-57edc8bfd700\") " pod="openshift-machine-config-operator/machine-config-server-lt54v" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.367742 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/789193fd-4e59-4291-9dd3-57edc8bfd700-node-bootstrap-token\") pod \"machine-config-server-lt54v\" (UID: \"789193fd-4e59-4291-9dd3-57edc8bfd700\") " pod="openshift-machine-config-operator/machine-config-server-lt54v" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.367778 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/084d1738-c9cb-4d72-98e1-dbbd06e4b084-socket-dir\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.367822 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/084d1738-c9cb-4d72-98e1-dbbd06e4b084-csi-data-dir\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.367850 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmwg2\" (UniqueName: \"kubernetes.io/projected/ba033d0b-fb5e-4f72-8ac4-31ade9c07a25-kube-api-access-bmwg2\") pod \"dns-default-26hp9\" (UID: \"ba033d0b-fb5e-4f72-8ac4-31ade9c07a25\") " pod="openshift-dns/dns-default-26hp9" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.367877 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/084d1738-c9cb-4d72-98e1-dbbd06e4b084-plugins-dir\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.368494 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/084d1738-c9cb-4d72-98e1-dbbd06e4b084-plugins-dir\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: E1003 18:16:39.368646 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:39.868623446 +0000 UTC m=+141.584564318 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.368690 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/084d1738-c9cb-4d72-98e1-dbbd06e4b084-registration-dir\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.369433 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba033d0b-fb5e-4f72-8ac4-31ade9c07a25-config-volume\") pod \"dns-default-26hp9\" (UID: \"ba033d0b-fb5e-4f72-8ac4-31ade9c07a25\") " pod="openshift-dns/dns-default-26hp9" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.369487 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/084d1738-c9cb-4d72-98e1-dbbd06e4b084-mountpoint-dir\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.370961 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/084d1738-c9cb-4d72-98e1-dbbd06e4b084-socket-dir\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.371117 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/084d1738-c9cb-4d72-98e1-dbbd06e4b084-csi-data-dir\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.378047 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/789193fd-4e59-4291-9dd3-57edc8bfd700-node-bootstrap-token\") pod \"machine-config-server-lt54v\" (UID: \"789193fd-4e59-4291-9dd3-57edc8bfd700\") " pod="openshift-machine-config-operator/machine-config-server-lt54v" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.382867 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba033d0b-fb5e-4f72-8ac4-31ade9c07a25-metrics-tls\") pod \"dns-default-26hp9\" (UID: \"ba033d0b-fb5e-4f72-8ac4-31ade9c07a25\") " pod="openshift-dns/dns-default-26hp9" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.383718 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcd9x\" (UniqueName: \"kubernetes.io/projected/81aa4a3b-1c57-4883-b423-dc237393b801-kube-api-access-rcd9x\") pod \"machine-config-operator-74547568cd-vg2h6\" (UID: \"81aa4a3b-1c57-4883-b423-dc237393b801\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" Oct 03 
18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.384060 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/789193fd-4e59-4291-9dd3-57edc8bfd700-certs\") pod \"machine-config-server-lt54v\" (UID: \"789193fd-4e59-4291-9dd3-57edc8bfd700\") " pod="openshift-machine-config-operator/machine-config-server-lt54v" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.408110 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bfn2\" (UniqueName: \"kubernetes.io/projected/6818d850-0c23-481b-b3f5-fbb31275d97f-kube-api-access-2bfn2\") pod \"marketplace-operator-79b997595-pf9vb\" (UID: \"6818d850-0c23-481b-b3f5-fbb31275d97f\") " pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.423207 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98b9r\" (UniqueName: \"kubernetes.io/projected/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-kube-api-access-98b9r\") pod \"collect-profiles-29325255-8n5s2\" (UID: \"8cb17f76-674e-4cf7-8f87-9af6942bc5c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.446038 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-l27kl\" (UID: \"9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.453011 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f"] Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.465499 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-bound-sa-token\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.469300 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: E1003 18:16:39.469948 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:39.969935563 +0000 UTC m=+141.685876435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.488221 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkkh7\" (UniqueName: \"kubernetes.io/projected/078d8acd-04ed-453e-a67b-39efe6ea5bf9-kube-api-access-rkkh7\") pod \"multus-admission-controller-857f4d67dd-fqtzs\" (UID: \"078d8acd-04ed-453e-a67b-39efe6ea5bf9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fqtzs" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.510232 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" event={"ID":"76f0bfed-fea0-4d28-bb03-2d3b0ae79d92","Type":"ContainerStarted","Data":"fc705e256f9bac48bacb782ffaf1ada5801d9df63f84d080756059e2512708cb"} Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.510346 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" event={"ID":"76f0bfed-fea0-4d28-bb03-2d3b0ae79d92","Type":"ContainerStarted","Data":"4041f4c86ce990df901b75fc3fe0538717779c2387dd84c54bfa286d823c49b3"} Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.510356 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" event={"ID":"76f0bfed-fea0-4d28-bb03-2d3b0ae79d92","Type":"ContainerStarted","Data":"b42e6263bd1209cb0de9f4ad8cdcd4305a867c60d8912cf33c35b8323d372777"} Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.513401 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp" event={"ID":"56ad1b37-3737-409f-a332-2676129348b6","Type":"ContainerStarted","Data":"2fee121ddd2da5905042760912d033ed46c052b86bd54067adf9938218a5860c"} Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.517969 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tz58z" event={"ID":"160f473b-3942-4848-affa-bb555f0068bc","Type":"ContainerStarted","Data":"36db9c4762e02f356512fec1d12deff790145ee1a4aa464693baf5dfab5f7d43"} Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.518000 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tz58z" event={"ID":"160f473b-3942-4848-affa-bb555f0068bc","Type":"ContainerStarted","Data":"a9678d251145af534626e2c5e6e3a1a20d52e22b9afa50517c58be6c6bcd4d5f"} Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.518732 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-tz58z" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.520123 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" event={"ID":"f3ea7745-8aa0-4bcd-86e4-326313d026bd","Type":"ContainerStarted","Data":"0d502cc88644f6f01e4f0761f4c75bc2338ea71c726786c4ec90321e9f2397aa"} Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.522195 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-v88q9\" (UniqueName: \"kubernetes.io/projected/59865c39-3a5f-44cf-b6a8-ce0552bc8d0b-kube-api-access-v88q9\") pod \"olm-operator-6b444d44fb-h6c75\" (UID: \"59865c39-3a5f-44cf-b6a8-ce0552bc8d0b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.530222 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkfmz" event={"ID":"13a81cd4-6b06-4a2b-a2c2-4a0778518313","Type":"ContainerStarted","Data":"5741e46d045e64f787c3e558a320dbde227d8616982aaacb7c9649de4cc573fa"} Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.530254 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkfmz" event={"ID":"13a81cd4-6b06-4a2b-a2c2-4a0778518313","Type":"ContainerStarted","Data":"2a5e7268d04a80d354757c3c9deac85b9886987a489aac3d8a3600a13b6e6218"} Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.530264 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkfmz" event={"ID":"13a81cd4-6b06-4a2b-a2c2-4a0778518313","Type":"ContainerStarted","Data":"802dc8ea17fdc0e0d9dbccac771558e0acd03006fb5d47fe83318523c0833ae2"} Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.531332 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" event={"ID":"476e65c8-6293-4edf-b6a6-197763d8f7e1","Type":"ContainerStarted","Data":"e6e94e0d2983cfe01359870f219ccd95d29fa50a145a036365b50763976b7f19"} Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.533732 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" event={"ID":"56bc4f0f-9acd-4179-be10-9ad383cbf689","Type":"ContainerStarted","Data":"aa86304a613a1489be20313b0666072fa28a152d30d8f9a1dd1a8346daec815f"} Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.535744 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" event={"ID":"6b59d837-ca72-447d-8b77-42675b0ec49b","Type":"ContainerStarted","Data":"26d88cf13570a96d11c043b5b67c91bf7a957348047ab1d25b9e47af2820fc99"} Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.535773 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" event={"ID":"6b59d837-ca72-447d-8b77-42675b0ec49b","Type":"ContainerStarted","Data":"7577de618cd02bd4bf6639a18bd116fe42cab8d9dbac69609e75252193e0268e"} Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.536201 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.536776 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ngjr\" (UniqueName: \"kubernetes.io/projected/aa5e9c31-c582-444e-97c2-9e285e2b75d4-kube-api-access-6ngjr\") pod \"control-plane-machine-set-operator-78cbb6b69f-5rj92\" (UID: \"aa5e9c31-c582-444e-97c2-9e285e2b75d4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5rj92" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.540848 4835 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.541667 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kttxv\" (UniqueName: \"kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-kube-api-access-kttxv\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.555759 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.557492 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdlgj\" (UniqueName: \"kubernetes.io/projected/321f8504-187a-46a9-b5bc-a27b93175e39-kube-api-access-hdlgj\") pod \"package-server-manager-789f6589d5-b7btc\" (UID: \"321f8504-187a-46a9-b5bc-a27b93175e39\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.570147 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:39 crc kubenswrapper[4835]: E1003 18:16:39.570560 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:40.070544592 +0000 UTC m=+141.786485464 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.571930 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.574799 4835 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mr45m container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.574846 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" podUID="6b59d837-ca72-447d-8b77-42675b0ec49b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.575633 4835 patch_prober.go:28] interesting pod/console-operator-58897d9998-tz58z container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.575778 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tz58z" podUID="160f473b-3942-4848-affa-bb555f0068bc" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.576356 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-fqtzs" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.584672 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5rj92" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.597585 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.606950 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.616251 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.621176 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.621627 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpgsh\" (UniqueName: \"kubernetes.io/projected/084d1738-c9cb-4d72-98e1-dbbd06e4b084-kube-api-access-rpgsh\") pod \"csi-hostpathplugin-bv6gz\" (UID: \"084d1738-c9cb-4d72-98e1-dbbd06e4b084\") " pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.624748 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6h95\" (UniqueName: \"kubernetes.io/projected/789193fd-4e59-4291-9dd3-57edc8bfd700-kube-api-access-l6h95\") pod \"machine-config-server-lt54v\" (UID: \"789193fd-4e59-4291-9dd3-57edc8bfd700\") " pod="openshift-machine-config-operator/machine-config-server-lt54v" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.629966 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.643964 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmwg2\" (UniqueName: \"kubernetes.io/projected/ba033d0b-fb5e-4f72-8ac4-31ade9c07a25-kube-api-access-bmwg2\") pod \"dns-default-26hp9\" (UID: \"ba033d0b-fb5e-4f72-8ac4-31ade9c07a25\") " pod="openshift-dns/dns-default-26hp9" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.656033 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj"] Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.656968 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qgmg8"] Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.670518 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-lt54v" Oct 03 18:16:39 crc kubenswrapper[4835]: E1003 18:16:39.671015 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3ea7745_8aa0_4bcd_86e4_326313d026bd.slice/crio-0e326844b01fd3a72decc6a4a0b1537e3b0b23c4f6c100fa9adefe528658c935.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3ea7745_8aa0_4bcd_86e4_326313d026bd.slice/crio-conmon-0e326844b01fd3a72decc6a4a0b1537e3b0b23c4f6c100fa9adefe528658c935.scope\": RecentStats: unable to find data in memory cache]" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.671464 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: E1003 18:16:39.673239 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:40.173226635 +0000 UTC m=+141.889167507 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.678757 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-26hp9" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.694770 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.754055 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-gnrjg"] Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.772189 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:39 crc kubenswrapper[4835]: E1003 18:16:39.772827 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:40.272805186 +0000 UTC m=+141.988746058 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.772921 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: E1003 18:16:39.773259 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:40.273251129 +0000 UTC m=+141.989192001 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.841091 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4tmrr"] Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.849889 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8nwfg"] Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.855061 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb"] Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.876006 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:39 crc kubenswrapper[4835]: E1003 18:16:39.876428 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:40.376405784 +0000 UTC m=+142.092346656 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.876521 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:39 crc kubenswrapper[4835]: E1003 18:16:39.876902 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:40.376895427 +0000 UTC m=+142.092836299 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:39 crc kubenswrapper[4835]: I1003 18:16:39.983382 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7"] Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.005584 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v7gmf"] Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.005653 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:40 crc kubenswrapper[4835]: E1003 18:16:40.005934 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:40.505888596 +0000 UTC m=+142.221829468 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.005986 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:40 crc kubenswrapper[4835]: E1003 18:16:40.006530 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:40.506521802 +0000 UTC m=+142.222462674 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.007781 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq"] Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.032876 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mcjr4"] Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.109594 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:40 crc kubenswrapper[4835]: E1003 18:16:40.110004 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:40.609988906 +0000 UTC m=+142.325929778 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.128415 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-prk5w"] Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.132458 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p7wfs"] Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.203695 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth"] Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.211126 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:40 crc kubenswrapper[4835]: E1003 18:16:40.211523 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:40.711508129 +0000 UTC m=+142.427449011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.235204 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2"] Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.315178 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-pglch"] Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.317252 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:40 crc kubenswrapper[4835]: E1003 18:16:40.317931 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:40.817912941 +0000 UTC m=+142.533853813 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.319609 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:40 crc kubenswrapper[4835]: E1003 18:16:40.319956 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:40.819945894 +0000 UTC m=+142.535886766 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:40 crc kubenswrapper[4835]: W1003 18:16:40.332529 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefa22703_0a66_407f_9f2e_333bac190ce8.slice/crio-3de1a64d88f915cfe1c5bdf19a30ca01fe561d6db9f5f164be9235605f398993 WatchSource:0}: Error finding container 3de1a64d88f915cfe1c5bdf19a30ca01fe561d6db9f5f164be9235605f398993: Status 404 returned error can't find the container with id 3de1a64d88f915cfe1c5bdf19a30ca01fe561d6db9f5f164be9235605f398993 Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.422064 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:40 crc kubenswrapper[4835]: E1003 18:16:40.422323 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:40.922275278 +0000 UTC m=+142.638216150 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.426128 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:40 crc kubenswrapper[4835]: E1003 18:16:40.426641 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:40.926624353 +0000 UTC m=+142.642565225 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.526866 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:40 crc kubenswrapper[4835]: E1003 18:16:40.527674 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:41.027655954 +0000 UTC m=+142.743596826 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.596319 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prk5w" event={"ID":"b2b48efb-bafd-45af-b56e-4568f2416af8","Type":"ContainerStarted","Data":"003b18b38dedfd1b3ab20701f77d7f9cd235b03991ef536588c0ff3c7ba77893"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.604936 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s"] Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.608039 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8nwfg" event={"ID":"300d2397-b9b1-4f44-9eb2-5757940cc64c","Type":"ContainerStarted","Data":"121c5ddf04475b188997326eedc59f80a16b5d56cb32de82704fe42cee031f18"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.608106 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8nwfg" event={"ID":"300d2397-b9b1-4f44-9eb2-5757940cc64c","Type":"ContainerStarted","Data":"ced84b48368a0a676d5a7a87e498c5a9ab595c059b9d76ccce4bef1a40fd55f2"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.616636 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qgmg8" event={"ID":"16d04d7d-2f2d-4bf7-a65f-0bda7bec3fb7","Type":"ContainerStarted","Data":"2cbfbd2b162e57d7c118ce51f5d4c1662faec6443fb521b027149cd526c1c77d"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.618525 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" event={"ID":"2cc40a88-90de-40be-b285-4b7f8bd11709","Type":"ContainerStarted","Data":"7e1d692e33eee0d50e222bdb1d8012b4f8d7e40dcaef311f93a7b00b43743554"} Oct 03 18:16:40 crc 
kubenswrapper[4835]: I1003 18:16:40.620389 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-677dr"] Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.634156 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:40 crc kubenswrapper[4835]: E1003 18:16:40.634717 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:41.134702782 +0000 UTC m=+142.850643654 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.636112 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2" event={"ID":"24baba3b-d1f1-426a-88a9-9bd5cb44112d","Type":"ContainerStarted","Data":"2bf0215b436abf71b9bab78b67b206c635320de2c728506781ebe65cac3dead9"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.640635 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" event={"ID":"476e65c8-6293-4edf-b6a6-197763d8f7e1","Type":"ContainerStarted","Data":"93d91633cf58b058827993c9b06afdc32e18e0e3ee2957673f6f15bd8a4340f5"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.644167 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-4fnq6" event={"ID":"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85","Type":"ContainerStarted","Data":"20d42595ec61d92d4ed81101db5a0053c9ae8f247e778963b9d29d614df63158"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.655798 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-lt54v" event={"ID":"789193fd-4e59-4291-9dd3-57edc8bfd700","Type":"ContainerStarted","Data":"6519f0790313183970715091c2c0f8b790028586f5632435d81630a965556f35"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.668318 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb" event={"ID":"96ad30f4-8507-4530-af53-06c628b1388e","Type":"ContainerStarted","Data":"7a9ee5f090794d0cb9eb68fcef4e58a028bd74a57b06c85ab2a210eca415d7dc"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.668357 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb" event={"ID":"96ad30f4-8507-4530-af53-06c628b1388e","Type":"ContainerStarted","Data":"251cbb247462efe1e09c565e2d28192fdce349f07a4a33139f6f4f4b7bbba927"} Oct 03 18:16:40 
crc kubenswrapper[4835]: I1003 18:16:40.677267 4835 generic.go:334] "Generic (PLEG): container finished" podID="56bc4f0f-9acd-4179-be10-9ad383cbf689" containerID="9aa30e364b3e4b9dd25cc8eb7b14d169bd62862be95e2d7f9346f8abf7104cb1" exitCode=0 Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.677382 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" event={"ID":"56bc4f0f-9acd-4179-be10-9ad383cbf689","Type":"ContainerDied","Data":"9aa30e364b3e4b9dd25cc8eb7b14d169bd62862be95e2d7f9346f8abf7104cb1"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.680237 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp" event={"ID":"56ad1b37-3737-409f-a332-2676129348b6","Type":"ContainerStarted","Data":"ecdef8ea76feaa10794ce442f23dedccd7c233712c837e4167eeee7b04692dc4"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.682301 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" event={"ID":"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1","Type":"ContainerStarted","Data":"90a68454c9dd918a0718e30f0828bd4898878e7f7829604fee393e9c0c36a941"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.685919 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" event={"ID":"f0eca3a7-03cd-4126-83d7-b15db1a7232f","Type":"ContainerStarted","Data":"7db8cc229d034c5d771e680cafd8de22a22531e1f5a847e505473e2ae80f6c0c"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.688784 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f" event={"ID":"1bc1c2c4-ec18-46df-b876-157c08bbde36","Type":"ContainerStarted","Data":"5b3d5a4c40aab3025d26558f6ad58fb03dfdd7689e521a8bb583bc97b4974502"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.688866 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f" event={"ID":"1bc1c2c4-ec18-46df-b876-157c08bbde36","Type":"ContainerStarted","Data":"cfe23d16957aa0a02d28dc3b36c1101f47466b0801d11c454c72bc910ed04585"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.736713 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:40 crc kubenswrapper[4835]: E1003 18:16:40.737208 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:41.23719442 +0000 UTC m=+142.953135292 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.749431 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mcjr4" event={"ID":"efa22703-0a66-407f-9f2e-333bac190ce8","Type":"ContainerStarted","Data":"3de1a64d88f915cfe1c5bdf19a30ca01fe561d6db9f5f164be9235605f398993"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.837847 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:40 crc kubenswrapper[4835]: E1003 18:16:40.846055 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:41.346037257 +0000 UTC m=+143.061978119 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.859837 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6"] Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.879728 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p"] Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.881674 4835 generic.go:334] "Generic (PLEG): container finished" podID="f3ea7745-8aa0-4bcd-86e4-326313d026bd" containerID="0e326844b01fd3a72decc6a4a0b1537e3b0b23c4f6c100fa9adefe528658c935" exitCode=0 Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.881778 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" event={"ID":"f3ea7745-8aa0-4bcd-86e4-326313d026bd","Type":"ContainerDied","Data":"0e326844b01fd3a72decc6a4a0b1537e3b0b23c4f6c100fa9adefe528658c935"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.887006 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" podStartSLOduration=121.886987368 podStartE2EDuration="2m1.886987368s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:40.885325325 +0000 UTC m=+142.601266197" 
watchObservedRunningTime="2025-10-03 18:16:40.886987368 +0000 UTC m=+142.602928230" Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.919424 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" event={"ID":"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0","Type":"ContainerStarted","Data":"41be3e5e4a4a9efaef0d63d3e11b3e787f6621e572bb8912680fb5fb4f426bc7"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.921088 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.921121 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" event={"ID":"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0","Type":"ContainerStarted","Data":"909cab64f19c4d7bd96b8881c6133c3f09fedab3827d09c084d01927f5a335e8"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.928308 4835 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-4tmrr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.928383 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" podUID="13e2e684-1dc3-4ea7-89a9-05dabb52b7f0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.940183 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:40 crc kubenswrapper[4835]: E1003 18:16:40.940399 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:41.440333538 +0000 UTC m=+143.156274410 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.940667 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.941102 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" event={"ID":"8ebdd246-dcb5-4785-b1a4-7133f6317e91","Type":"ContainerStarted","Data":"08e0823cfdb05755c152008775b33fb52e30b0d8149f3f992be2119d05cee5d0"} Oct 03 18:16:40 crc kubenswrapper[4835]: E1003 18:16:40.942511 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:41.442501436 +0000 UTC m=+143.158442308 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.946186 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" event={"ID":"810a8fd3-d63d-4fd1-b6f1-186457e8878a","Type":"ContainerStarted","Data":"8d7f726980351b1158ebaa5d8680efdd1b86ce6a33ee3b3d7717ce3bbe3bcb30"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.978705 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" event={"ID":"6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47","Type":"ContainerStarted","Data":"096f4a9e91f315ae23c9dfc4c51634107a3dd703cc32d5482d55a00d7fc81cc2"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.997942 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj" event={"ID":"3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8","Type":"ContainerStarted","Data":"b25aee430705e142b744c724809b669f5fe0b86a045a78e7285734d8e5b7b532"} Oct 03 18:16:40 crc kubenswrapper[4835]: I1003 18:16:40.997987 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj" event={"ID":"3171b8bf-5c2b-4e0f-9ca2-a2a879ad00f8","Type":"ContainerStarted","Data":"b0f94fc0e220705712aeb36f6347276be468015d74e9b945135ca454f391f47f"} Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.013207 4835 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.013326 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-tz58z" Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.042170 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:41 crc kubenswrapper[4835]: E1003 18:16:41.042337 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:41.542312663 +0000 UTC m=+143.258253535 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.047472 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:41 crc kubenswrapper[4835]: E1003 18:16:41.050288 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:41.550272094 +0000 UTC m=+143.266212966 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.133461 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv"] Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.154475 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.158020 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fqtzs"] Oct 03 18:16:41 crc kubenswrapper[4835]: E1003 18:16:41.168233 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:41.668200039 +0000 UTC m=+143.384140901 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.197844 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lf89q"] Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.215347 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl"] Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.217180 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pf9vb"] Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.238121 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bkfmz" podStartSLOduration=123.236052952 podStartE2EDuration="2m3.236052952s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:41.21213717 +0000 UTC m=+142.928078042" watchObservedRunningTime="2025-10-03 18:16:41.236052952 +0000 UTC m=+142.951993824" Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.259832 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:41 crc kubenswrapper[4835]: E1003 18:16:41.260151 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:41.760139799 +0000 UTC m=+143.476080671 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.268715 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bv6gz"] Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.295288 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5rj92"] Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.356582 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2"] Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.369680 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:41 crc kubenswrapper[4835]: E1003 18:16:41.370323 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:41.870271569 +0000 UTC m=+143.586212441 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.374097 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-26hp9"] Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.386979 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc"] Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.406687 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75"] Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.459988 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-v7t7j" podStartSLOduration=122.459966989 podStartE2EDuration="2m2.459966989s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:41.45659794 +0000 UTC m=+143.172538832" watchObservedRunningTime="2025-10-03 18:16:41.459966989 +0000 UTC m=+143.175907871" Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.471043 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:41 crc kubenswrapper[4835]: E1003 18:16:41.471555 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:41.971542726 +0000 UTC m=+143.687483598 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:41 crc kubenswrapper[4835]: W1003 18:16:41.497273 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59865c39_3a5f_44cf_b6a8_ce0552bc8d0b.slice/crio-20bd008556e604a0e469b84559289e47f9ca156ae721359193ce07c7304c4758 WatchSource:0}: Error finding container 20bd008556e604a0e469b84559289e47f9ca156ae721359193ce07c7304c4758: Status 404 returned error can't find the container with id 20bd008556e604a0e469b84559289e47f9ca156ae721359193ce07c7304c4758 Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.534854 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-tz58z" podStartSLOduration=123.534812197 podStartE2EDuration="2m3.534812197s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:41.532563437 +0000 UTC m=+143.248504319" watchObservedRunningTime="2025-10-03 18:16:41.534812197 +0000 UTC m=+143.250753059" Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.572298 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:41 crc kubenswrapper[4835]: E1003 18:16:41.572670 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:42.072655277 +0000 UTC m=+143.788596139 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.579617 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ttggp" podStartSLOduration=122.579593481 podStartE2EDuration="2m2.579593481s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:41.577005342 +0000 UTC m=+143.292946214" watchObservedRunningTime="2025-10-03 18:16:41.579593481 +0000 UTC m=+143.295534353" Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.675343 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:41 crc kubenswrapper[4835]: E1003 18:16:41.675636 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:42.175624118 +0000 UTC m=+143.891564980 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.744485 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lbkjb" podStartSLOduration=123.744467218 podStartE2EDuration="2m3.744467218s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:41.742198787 +0000 UTC m=+143.458139659" watchObservedRunningTime="2025-10-03 18:16:41.744467218 +0000 UTC m=+143.460408080" Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.782317 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:41 crc kubenswrapper[4835]: E1003 18:16:41.782623 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:42.282596055 +0000 UTC m=+143.998536927 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.783095 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.783399 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-8nwfg" podStartSLOduration=123.783375235 podStartE2EDuration="2m3.783375235s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:41.770412293 +0000 UTC m=+143.486353155" watchObservedRunningTime="2025-10-03 18:16:41.783375235 +0000 UTC m=+143.499316107" Oct 03 18:16:41 crc kubenswrapper[4835]: E1003 18:16:41.784028 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:42.284004722 +0000 UTC m=+143.999945594 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.874521 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvkvj" podStartSLOduration=123.874498513 podStartE2EDuration="2m3.874498513s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:41.870417916 +0000 UTC m=+143.586358788" watchObservedRunningTime="2025-10-03 18:16:41.874498513 +0000 UTC m=+143.590439375" Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.890237 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:41 crc kubenswrapper[4835]: E1003 18:16:41.890532 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:42.390516557 +0000 UTC m=+144.106457429 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:41 crc kubenswrapper[4835]: I1003 18:16:41.991276 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:41 crc kubenswrapper[4835]: E1003 18:16:41.991740 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:42.491720491 +0000 UTC m=+144.207661363 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.007143 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xjd4f" podStartSLOduration=123.007111197 podStartE2EDuration="2m3.007111197s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:41.93190877 +0000 UTC m=+143.647849642" watchObservedRunningTime="2025-10-03 18:16:42.007111197 +0000 UTC m=+143.723052069" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.009235 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" podStartSLOduration=124.009226394 podStartE2EDuration="2m4.009226394s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.006213964 +0000 UTC m=+143.722154836" watchObservedRunningTime="2025-10-03 18:16:42.009226394 +0000 UTC m=+143.725167266" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.084203 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" event={"ID":"32005ae4-91c7-48c7-a713-59738b849926","Type":"ContainerStarted","Data":"84dd28eebfbe7741248d60b3de3ab4917f0875791e36f955f65f5210985b9289"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.085124 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" event={"ID":"32005ae4-91c7-48c7-a713-59738b849926","Type":"ContainerStarted","Data":"daeec537f43fdb9b237a4fc72d53de67646f38adfb952e2bafddc8bbf2f7d248"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.085140 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.086773 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qgmg8" event={"ID":"16d04d7d-2f2d-4bf7-a65f-0bda7bec3fb7","Type":"ContainerStarted","Data":"0ff55e4ec5385e24c79912a4200ea3b497cf95bc902ec8be01dca66bec4f7a8c"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.086798 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qgmg8" event={"ID":"16d04d7d-2f2d-4bf7-a65f-0bda7bec3fb7","Type":"ContainerStarted","Data":"b17509acc02c45be1917e5031da29f014e1bb2c30b002b1a910834cc91a77bf5"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.089563 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fqtzs" 
event={"ID":"078d8acd-04ed-453e-a67b-39efe6ea5bf9","Type":"ContainerStarted","Data":"066450ef6df11a2c927ea898cec0dcdf3123ceecd2bedbd232880818e1f3b189"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.091542 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:42 crc kubenswrapper[4835]: E1003 18:16:42.091836 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:42.591820116 +0000 UTC m=+144.307760978 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.092951 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pglch" event={"ID":"7807d4de-42f2-489a-af4d-0317ebbc154c","Type":"ContainerStarted","Data":"511872d25cf2314faf4a3d8b92ba49ff2923e98d81ec2e80b8f2810fbd7caf60"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.092993 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pglch" event={"ID":"7807d4de-42f2-489a-af4d-0317ebbc154c","Type":"ContainerStarted","Data":"07edda86304bd27584977da7eb84b91caf2f8c300f52cf26d4a0b4c950cd2f78"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.096192 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" event={"ID":"81aa4a3b-1c57-4883-b423-dc237393b801","Type":"ContainerStarted","Data":"d084dc2bd3790beefd5692b0d518401771392f5c9144ad69ba5ccb819f8de4ab"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.096227 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" event={"ID":"81aa4a3b-1c57-4883-b423-dc237393b801","Type":"ContainerStarted","Data":"6d1bbf6b4c2875fecc95a163cf1a8b7f3e19a69bcaae8f468e862e52265d2d37"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.096237 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" event={"ID":"81aa4a3b-1c57-4883-b423-dc237393b801","Type":"ContainerStarted","Data":"df280680502b189c029afae2915d20e0160c362416c2aca5562907accb0e48fe"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.098754 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5rj92" event={"ID":"aa5e9c31-c582-444e-97c2-9e285e2b75d4","Type":"ContainerStarted","Data":"d2c817147607d8c978bfb1be6b8cf0ac702da06febc9e25d26d3d8325870fdca"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.108151 4835 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" podStartSLOduration=123.108136077 podStartE2EDuration="2m3.108136077s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.106477803 +0000 UTC m=+143.822418675" watchObservedRunningTime="2025-10-03 18:16:42.108136077 +0000 UTC m=+143.824076949" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.109947 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-jxp2s container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.110086 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" podUID="32005ae4-91c7-48c7-a713-59738b849926" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.116152 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5rj92" event={"ID":"aa5e9c31-c582-444e-97c2-9e285e2b75d4","Type":"ContainerStarted","Data":"33b3a2ef46f951a832b572677d1256bbb480e3b379a1f3e78b64a32f2fc0becc"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.116222 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mcjr4" event={"ID":"efa22703-0a66-407f-9f2e-333bac190ce8","Type":"ContainerStarted","Data":"f58773838d74ec7d79a91015b6f00274ef0700661481a225d7e571f1b20928af"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.116242 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" event={"ID":"084d1738-c9cb-4d72-98e1-dbbd06e4b084","Type":"ContainerStarted","Data":"1ab299ac0d6b78dbb9038452902ad5d40c29484e776489a78ed18dfd3705dbfd"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.116274 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mcjr4" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.124383 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-mcjr4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.124424 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mcjr4" podUID="efa22703-0a66-407f-9f2e-333bac190ce8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.131766 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" event={"ID":"0545b5e8-2fe4-474f-aa8b-a00964fc6237","Type":"ContainerStarted","Data":"6b41af0b285f94d4f156d3818944585af8de01f5ff78d89677610d1cf4b152a3"} Oct 03 18:16:42 crc kubenswrapper[4835]: 
I1003 18:16:42.131813 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" event={"ID":"0545b5e8-2fe4-474f-aa8b-a00964fc6237","Type":"ContainerStarted","Data":"9a67255554582acbf90eafaa15518f3214a42ac9139e8d1df1e97ead7d5577fa"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.133203 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.147523 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-mcjr4" podStartSLOduration=124.147502617 podStartE2EDuration="2m4.147502617s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.146488101 +0000 UTC m=+143.862428973" watchObservedRunningTime="2025-10-03 18:16:42.147502617 +0000 UTC m=+143.863443489" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.150126 4835 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-b5c8p container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.152903 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" podUID="0545b5e8-2fe4-474f-aa8b-a00964fc6237" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.151776 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" event={"ID":"8cb17f76-674e-4cf7-8f87-9af6942bc5c3","Type":"ContainerStarted","Data":"8fbe358286054a65ed84e52578e85518c8a89fe8ed68f095049ee56c03c4cc95"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.155308 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75" event={"ID":"59865c39-3a5f-44cf-b6a8-ce0552bc8d0b","Type":"ContainerStarted","Data":"20bd008556e604a0e469b84559289e47f9ca156ae721359193ce07c7304c4758"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.174472 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-qgmg8" podStartSLOduration=124.17445274 podStartE2EDuration="2m4.17445274s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.173844523 +0000 UTC m=+143.889785395" watchObservedRunningTime="2025-10-03 18:16:42.17445274 +0000 UTC m=+143.890393612" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.190478 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl" event={"ID":"9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6","Type":"ContainerStarted","Data":"7966560d6fc75fae887264c24a684bfe2414ab84685530a85cd2755f6607603c"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.193603 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:42 crc kubenswrapper[4835]: E1003 18:16:42.194659 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:42.694641443 +0000 UTC m=+144.410582315 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.236218 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" event={"ID":"56bc4f0f-9acd-4179-be10-9ad383cbf689","Type":"ContainerStarted","Data":"e75301d6d14c727a497dc88747430c34fc2dd16e621356115b9ac602fada158f"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.236752 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.241415 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" event={"ID":"476e65c8-6293-4edf-b6a6-197763d8f7e1","Type":"ContainerStarted","Data":"9154c99b496f074b3256bd125bfa5e8110bdc128bbe803bb4dea1b586153b7cf"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.254813 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5rj92" podStartSLOduration=123.254792182 podStartE2EDuration="2m3.254792182s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.2547134 +0000 UTC m=+143.970654272" watchObservedRunningTime="2025-10-03 18:16:42.254792182 +0000 UTC m=+143.970733044" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.255712 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-lt54v" event={"ID":"789193fd-4e59-4291-9dd3-57edc8bfd700","Type":"ContainerStarted","Data":"7344c6bbad7d34dab9b0a0ccf825f4d8dbfa36d13137175e74347136939dd28e"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.256659 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-pglch" podStartSLOduration=123.256651711 podStartE2EDuration="2m3.256651711s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.215992037 +0000 UTC 
m=+143.931932909" watchObservedRunningTime="2025-10-03 18:16:42.256651711 +0000 UTC m=+143.972592583" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.261853 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" event={"ID":"8ebdd246-dcb5-4785-b1a4-7133f6317e91","Type":"ContainerStarted","Data":"39a1faf522cc29b1e6e2a82c7e0f05e0c1ef5f0e5dd036b7c17a989c2e56b585"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.261888 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" event={"ID":"8ebdd246-dcb5-4785-b1a4-7133f6317e91","Type":"ContainerStarted","Data":"959a4c22a123b4787728acafd0e10b688e32e8ec2169cab912f5da744e4115b7"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.299772 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:42 crc kubenswrapper[4835]: E1003 18:16:42.300797 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:42.800783038 +0000 UTC m=+144.516723900 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.314856 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" event={"ID":"6cd9b9f8-6098-4b34-8bbd-f9be4e2aeb47","Type":"ContainerStarted","Data":"6fd4b0590adee7d37f05a6c68c4e3b81dee904d483a6a514a66f0449fa2f148c"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.357025 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" podStartSLOduration=124.357003753 podStartE2EDuration="2m4.357003753s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.356263424 +0000 UTC m=+144.072204306" watchObservedRunningTime="2025-10-03 18:16:42.357003753 +0000 UTC m=+144.072944625" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.357542 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vg2h6" podStartSLOduration=123.357537427 podStartE2EDuration="2m3.357537427s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.301294901 +0000 UTC m=+144.017235773" watchObservedRunningTime="2025-10-03 18:16:42.357537427 +0000 
UTC m=+144.073478289" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.362502 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" event={"ID":"f0eca3a7-03cd-4126-83d7-b15db1a7232f","Type":"ContainerStarted","Data":"05d81d56ce7c4fa577fc06f02fe705aad4f73b234a665d915d9594f6fb9456da"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.403908 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:42 crc kubenswrapper[4835]: E1003 18:16:42.404257 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:42.904243992 +0000 UTC m=+144.620184864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.430861 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-lt54v" podStartSLOduration=6.430847745 podStartE2EDuration="6.430847745s" podCreationTimestamp="2025-10-03 18:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.429885359 +0000 UTC m=+144.145826231" watchObservedRunningTime="2025-10-03 18:16:42.430847745 +0000 UTC m=+144.146788617" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.432171 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-95g8t" podStartSLOduration=124.43216671 podStartE2EDuration="2m4.43216671s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.405971877 +0000 UTC m=+144.121912749" watchObservedRunningTime="2025-10-03 18:16:42.43216671 +0000 UTC m=+144.148107582" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.440704 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prk5w" event={"ID":"b2b48efb-bafd-45af-b56e-4568f2416af8","Type":"ContainerStarted","Data":"ef3408fc3fbfdbe3c10f50ed2f59d030e9920112fd954a4dd7fb927b41950813"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.440763 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prk5w" event={"ID":"b2b48efb-bafd-45af-b56e-4568f2416af8","Type":"ContainerStarted","Data":"0dbb71b3b6d9a78e299210761793944813c100c834ab849ad985304e8170b669"} Oct 03 18:16:42 crc 
kubenswrapper[4835]: I1003 18:16:42.486162 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-26hp9" event={"ID":"ba033d0b-fb5e-4f72-8ac4-31ade9c07a25","Type":"ContainerStarted","Data":"1de3d5de1417b09484fa1ad2dbfcbb3cb32d6baf6825005291726f4041214863"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.505784 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" podStartSLOduration=123.505761354 podStartE2EDuration="2m3.505761354s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.459473221 +0000 UTC m=+144.175414093" watchObservedRunningTime="2025-10-03 18:16:42.505761354 +0000 UTC m=+144.221702226" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.507741 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" event={"ID":"f3ea7745-8aa0-4bcd-86e4-326313d026bd","Type":"ContainerStarted","Data":"200799decfad972e9cbbfb9ac7df2ea2c6263d4478b16bc352cd09510d8a73cd"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.508053 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:42 crc kubenswrapper[4835]: E1003 18:16:42.509309 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:43.009288888 +0000 UTC m=+144.725229760 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.537558 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dbvth" podStartSLOduration=124.537539354 podStartE2EDuration="2m4.537539354s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.507398658 +0000 UTC m=+144.223339530" watchObservedRunningTime="2025-10-03 18:16:42.537539354 +0000 UTC m=+144.253480216" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.543467 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prk5w" podStartSLOduration=123.54344893 podStartE2EDuration="2m3.54344893s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.536920758 +0000 UTC m=+144.252861630" watchObservedRunningTime="2025-10-03 18:16:42.54344893 +0000 UTC m=+144.259389802" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.545035 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-4fnq6" event={"ID":"a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85","Type":"ContainerStarted","Data":"b8335b59e7d67edb777aa9f7e35b7948b59aeb3693659d49d7f2ceb4fffa088c"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.630429 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-677dr" event={"ID":"28dda709-c837-4e67-90be-7acda1dd093a","Type":"ContainerStarted","Data":"46040a35dce22328d58921ea5103457ef3be34bcc49f134e8614f26511b9d337"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.630726 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-677dr" event={"ID":"28dda709-c837-4e67-90be-7acda1dd093a","Type":"ContainerStarted","Data":"42aec68da1931e66e073829c456b16ee744b6c91237ce3373a50f41b4ecb4383"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.631163 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.631636 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:42 crc kubenswrapper[4835]: E1003 18:16:42.639835 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-03 18:16:43.139819997 +0000 UTC m=+144.855760869 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.642853 4835 patch_prober.go:28] interesting pod/router-default-5444994796-4fnq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 18:16:42 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 03 18:16:42 crc kubenswrapper[4835]: [+]process-running ok Oct 03 18:16:42 crc kubenswrapper[4835]: healthz check failed Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.642902 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4fnq6" podUID="a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.647401 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" event={"ID":"810a8fd3-d63d-4fd1-b6f1-186457e8878a","Type":"ContainerStarted","Data":"b4d11948c331929a29b91df54b914305217cae1238df57d1e4867522893f581c"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.647609 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.652463 4835 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-p7wfs container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body= Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.652500 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" podUID="810a8fd3-d63d-4fd1-b6f1-186457e8878a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.652692 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lwndq" podStartSLOduration=124.652672946 podStartE2EDuration="2m4.652672946s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.585548802 +0000 UTC m=+144.301489674" watchObservedRunningTime="2025-10-03 18:16:42.652672946 +0000 UTC m=+144.368613808" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.652980 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-v7gmf" podStartSLOduration=124.652976104 
podStartE2EDuration="2m4.652976104s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.648005083 +0000 UTC m=+144.363945955" watchObservedRunningTime="2025-10-03 18:16:42.652976104 +0000 UTC m=+144.368916966" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.682740 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" event={"ID":"6818d850-0c23-481b-b3f5-fbb31275d97f","Type":"ContainerStarted","Data":"3078d612e85d68749cc69955c879ce82cc5ff9c4a5da27ceb32f92b719417091"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.682789 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" event={"ID":"6818d850-0c23-481b-b3f5-fbb31275d97f","Type":"ContainerStarted","Data":"ae68ca3c1c972b7e78dabcd3d3cba3c3374507705fb8a959d86e3aa45bfdb627"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.683318 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.688352 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-4fnq6" podStartSLOduration=124.688335949 podStartE2EDuration="2m4.688335949s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.688242717 +0000 UTC m=+144.404183589" watchObservedRunningTime="2025-10-03 18:16:42.688335949 +0000 UTC m=+144.404276821" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.690263 4835 generic.go:334] "Generic (PLEG): container finished" podID="6dbbcec6-d076-48dc-8d4d-93668ce8f2a1" containerID="cf3e3db525529ba227dd04663dfaf86e1ef43895f15669c3e221d2be7a0b88db" exitCode=0 Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.690323 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" event={"ID":"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1","Type":"ContainerDied","Data":"cf3e3db525529ba227dd04663dfaf86e1ef43895f15669c3e221d2be7a0b88db"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.698289 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-pf9vb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.698348 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" podUID="6818d850-0c23-481b-b3f5-fbb31275d97f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.700352 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lf89q" event={"ID":"64724db7-de23-427c-a555-585e9c0b7173","Type":"ContainerStarted","Data":"ce062d3cca8f9c0c9b3f5232b52c668222def5af4925a2ab660ac1bc082b3c0a"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.700382 4835 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lf89q" event={"ID":"64724db7-de23-427c-a555-585e9c0b7173","Type":"ContainerStarted","Data":"3b3e3b33a37a28f987d17ae5b168ba1595f1ee50f00dfc458df2ce361459e53b"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.735629 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:42 crc kubenswrapper[4835]: E1003 18:16:42.736011 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:43.235995678 +0000 UTC m=+144.951936550 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.736929 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" podStartSLOduration=124.736915272 podStartE2EDuration="2m4.736915272s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.735975648 +0000 UTC m=+144.451916520" watchObservedRunningTime="2025-10-03 18:16:42.736915272 +0000 UTC m=+144.452856144" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.767826 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-677dr" podStartSLOduration=123.767810798 podStartE2EDuration="2m3.767810798s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.767276525 +0000 UTC m=+144.483217397" watchObservedRunningTime="2025-10-03 18:16:42.767810798 +0000 UTC m=+144.483751670" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.780143 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc" event={"ID":"321f8504-187a-46a9-b5bc-a27b93175e39","Type":"ContainerStarted","Data":"6eb25b25e389d22ae1054a4cfe78e9a26b4960bbe9d9baf54c7c7dda5da5b767"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.780817 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.821965 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv" 
event={"ID":"9da20332-c5f6-49ae-af6e-cabb1e166cf5","Type":"ContainerStarted","Data":"bbc0e17703347c08576c2fd11246c342e089bb714fe28af71a8d09de15efd30e"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.822008 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv" event={"ID":"9da20332-c5f6-49ae-af6e-cabb1e166cf5","Type":"ContainerStarted","Data":"5f46bab9ebca2fb953137ebc73cef10dbefd86f5b34fbdd14d6894ed4c531835"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.830347 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" event={"ID":"2cc40a88-90de-40be-b285-4b7f8bd11709","Type":"ContainerStarted","Data":"8d400f1d6bdbfef7a685fc7966b1c70ca582f6917c0233ca25f48644912b0be1"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.843756 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:42 crc kubenswrapper[4835]: E1003 18:16:42.848841 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:43.348827039 +0000 UTC m=+145.064767911 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.861772 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2" event={"ID":"24baba3b-d1f1-426a-88a9-9bd5cb44112d","Type":"ContainerStarted","Data":"57de4186ec674fd8538b97855d8aa3a2510dba747629ed07c09eef558af30c49"} Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.888454 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.900035 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lf89q" podStartSLOduration=6.900021893 podStartE2EDuration="6.900021893s" podCreationTimestamp="2025-10-03 18:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.857434928 +0000 UTC m=+144.573375800" watchObservedRunningTime="2025-10-03 18:16:42.900021893 +0000 UTC m=+144.615962765" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.901596 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" podStartSLOduration=123.901591024 podStartE2EDuration="2m3.901591024s" 
podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.89880797 +0000 UTC m=+144.614748842" watchObservedRunningTime="2025-10-03 18:16:42.901591024 +0000 UTC m=+144.617531896" Oct 03 18:16:42 crc kubenswrapper[4835]: I1003 18:16:42.948558 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:42 crc kubenswrapper[4835]: E1003 18:16:42.948898 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:43.448873354 +0000 UTC m=+145.164814276 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.059852 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:43 crc kubenswrapper[4835]: E1003 18:16:43.061327 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:43.561309114 +0000 UTC m=+145.277250076 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.075838 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc" podStartSLOduration=124.075822878 podStartE2EDuration="2m4.075822878s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:42.948361559 +0000 UTC m=+144.664302431" watchObservedRunningTime="2025-10-03 18:16:43.075822878 +0000 UTC m=+144.791763740" Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.161657 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:43 crc kubenswrapper[4835]: E1003 18:16:43.163573 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:43.663548086 +0000 UTC m=+145.379488958 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.166742 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:43 crc kubenswrapper[4835]: E1003 18:16:43.167305 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:43.667291495 +0000 UTC m=+145.383232367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.268564 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:43 crc kubenswrapper[4835]: E1003 18:16:43.268867 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:43.768852678 +0000 UTC m=+145.484793550 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.271507 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mc5r2" podStartSLOduration=124.271487798 podStartE2EDuration="2m4.271487798s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:43.085160145 +0000 UTC m=+144.801101017" watchObservedRunningTime="2025-10-03 18:16:43.271487798 +0000 UTC m=+144.987428670" Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.370352 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:43 crc kubenswrapper[4835]: E1003 18:16:43.370803 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:43.870785402 +0000 UTC m=+145.586726274 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.389949 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-gnrjg" podStartSLOduration=125.389928008 podStartE2EDuration="2m5.389928008s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:43.363368317 +0000 UTC m=+145.079309189" watchObservedRunningTime="2025-10-03 18:16:43.389928008 +0000 UTC m=+145.105868880" Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.471262 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:43 crc kubenswrapper[4835]: E1003 18:16:43.471854 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:43.971823372 +0000 UTC m=+145.687764244 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.572520 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:43 crc kubenswrapper[4835]: E1003 18:16:43.572826 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:44.07281484 +0000 UTC m=+145.788755712 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.624146 4835 patch_prober.go:28] interesting pod/router-default-5444994796-4fnq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 18:16:43 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 03 18:16:43 crc kubenswrapper[4835]: [+]process-running ok Oct 03 18:16:43 crc kubenswrapper[4835]: healthz check failed Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.624204 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4fnq6" podUID="a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.673769 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:43 crc kubenswrapper[4835]: E1003 18:16:43.673913 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:44.173888292 +0000 UTC m=+145.889829164 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.674016 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:43 crc kubenswrapper[4835]: E1003 18:16:43.674412 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:44.174403575 +0000 UTC m=+145.890344437 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.774961 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:43 crc kubenswrapper[4835]: E1003 18:16:43.775163 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:44.275137027 +0000 UTC m=+145.991077899 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.775335 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:43 crc kubenswrapper[4835]: E1003 18:16:43.775631 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:44.27561998 +0000 UTC m=+145.991560852 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.868705 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75" event={"ID":"59865c39-3a5f-44cf-b6a8-ce0552bc8d0b","Type":"ContainerStarted","Data":"e9a694431b17bacd87a3408dcce6e9bd37a5677822107684f3742d10d5a846eb"} Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.869102 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75" Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.871335 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-26hp9" event={"ID":"ba033d0b-fb5e-4f72-8ac4-31ade9c07a25","Type":"ContainerStarted","Data":"68ecff6ced3ae52051a8d36feeab696a91771e7a8fe1a33e224b6ae481437b39"} Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.871419 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-26hp9" event={"ID":"ba033d0b-fb5e-4f72-8ac4-31ade9c07a25","Type":"ContainerStarted","Data":"e72f882439114eeab03da7450f3ff81e6c68a453eb2d8b603cafe9bd399ba345"} Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.871501 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-26hp9" Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.873165 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc" event={"ID":"321f8504-187a-46a9-b5bc-a27b93175e39","Type":"ContainerStarted","Data":"119592f62379bce35b4b4a752eb1f7f1dce3fb64d5aec8d93d4c57d41e63ede1"} Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.873209 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc" event={"ID":"321f8504-187a-46a9-b5bc-a27b93175e39","Type":"ContainerStarted","Data":"0b7a7aae3cc26d269f3abd80544576bdee013387d293bcd9cca08ceae9a3c0b6"} Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.875126 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv" event={"ID":"9da20332-c5f6-49ae-af6e-cabb1e166cf5","Type":"ContainerStarted","Data":"e161084a7360fb36800cc62c0de23588ab813434ee5c2973882b93168d8cb0a0"} Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.875679 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:43 crc kubenswrapper[4835]: E1003 18:16:43.875857 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 18:16:44.375833268 +0000 UTC m=+146.091774230 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.876123 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:43 crc kubenswrapper[4835]: E1003 18:16:43.876437 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:44.376428864 +0000 UTC m=+146.092369736 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.878024 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fqtzs" event={"ID":"078d8acd-04ed-453e-a67b-39efe6ea5bf9","Type":"ContainerStarted","Data":"08b98d5fd83c9e5623f1055ccd8de2dcb51f88ed2c2a9a9f6d43556fa239f108"} Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.878088 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fqtzs" event={"ID":"078d8acd-04ed-453e-a67b-39efe6ea5bf9","Type":"ContainerStarted","Data":"38a28775a02c91fa5b19afd9c37a8db305bdc066d631c93f4fcedcb9b324112f"} Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.881041 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" event={"ID":"f3ea7745-8aa0-4bcd-86e4-326313d026bd","Type":"ContainerStarted","Data":"6944810b35d08c5b218b5e8cbba8858761c1393977d36582b41d1bd3165d106b"} Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.884207 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" event={"ID":"6dbbcec6-d076-48dc-8d4d-93668ce8f2a1","Type":"ContainerStarted","Data":"bb96afa0781ad98f25b44090bea4b71f6dbed5e02077ab0877b973776adec85a"} Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.885850 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" event={"ID":"084d1738-c9cb-4d72-98e1-dbbd06e4b084","Type":"ContainerStarted","Data":"c3aaebb2511b1f9957cea6b500dd577f9123ae5cd2a9db1c0d184c46505d7f94"} Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.887464 4835 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl" event={"ID":"9ceb6f23-7bb6-4eca-92b5-13d8c1e4a2a6","Type":"ContainerStarted","Data":"a4682ba9c849d5b7a40df696b4e8b7cc5578161808903fc7c010597f9948d9a4"} Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.889639 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75" Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.890540 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" event={"ID":"8cb17f76-674e-4cf7-8f87-9af6942bc5c3","Type":"ContainerStarted","Data":"943fb918381beb17520dd52bec8c1647113e869e66f2c357d45710ecbb9b300f"} Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.893338 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-pf9vb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.893403 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" podUID="6818d850-0c23-481b-b3f5-fbb31275d97f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.893457 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-mcjr4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.893499 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mcjr4" podUID="efa22703-0a66-407f-9f2e-333bac190ce8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.900663 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-b5c8p" Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.907090 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z6xvv" podStartSLOduration=124.907058493 podStartE2EDuration="2m4.907058493s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:43.392419644 +0000 UTC m=+145.108360516" watchObservedRunningTime="2025-10-03 18:16:43.907058493 +0000 UTC m=+145.622999355" Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.907175 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-h6c75" podStartSLOduration=124.907170766 podStartE2EDuration="2m4.907170766s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-03 18:16:43.906461358 +0000 UTC m=+145.622402230" watchObservedRunningTime="2025-10-03 18:16:43.907170766 +0000 UTC m=+145.623111638" Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.959442 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-26hp9" podStartSLOduration=7.959420507 podStartE2EDuration="7.959420507s" podCreationTimestamp="2025-10-03 18:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:43.957450995 +0000 UTC m=+145.673391867" watchObservedRunningTime="2025-10-03 18:16:43.959420507 +0000 UTC m=+145.675361379" Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.974685 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.977491 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:43 crc kubenswrapper[4835]: E1003 18:16:43.977664 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:44.477638599 +0000 UTC m=+146.193579471 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:43 crc kubenswrapper[4835]: I1003 18:16:43.978042 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:43 crc kubenswrapper[4835]: E1003 18:16:43.980037 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:44.480028691 +0000 UTC m=+146.195969563 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.007790 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" podStartSLOduration=126.007767784 podStartE2EDuration="2m6.007767784s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:44.0045838 +0000 UTC m=+145.720524692" watchObservedRunningTime="2025-10-03 18:16:44.007767784 +0000 UTC m=+145.723708656" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.074433 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.074771 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.076568 4835 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-n8dm7 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.17:8443/livez\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.076615 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" podUID="6dbbcec6-d076-48dc-8d4d-93668ce8f2a1" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.17:8443/livez\": dial tcp 10.217.0.17:8443: connect: connection refused" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.079699 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:44 crc kubenswrapper[4835]: E1003 18:16:44.079958 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:44.579943102 +0000 UTC m=+146.295883974 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.094940 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-fqtzs" podStartSLOduration=125.094907267 podStartE2EDuration="2m5.094907267s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:44.093010537 +0000 UTC m=+145.808951409" watchObservedRunningTime="2025-10-03 18:16:44.094907267 +0000 UTC m=+145.810848139" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.182529 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:44 crc kubenswrapper[4835]: E1003 18:16:44.182765 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:44.682754568 +0000 UTC m=+146.398695440 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.218895 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" podStartSLOduration=104.218881193 podStartE2EDuration="1m44.218881193s" podCreationTimestamp="2025-10-03 18:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:44.177423277 +0000 UTC m=+145.893364149" watchObservedRunningTime="2025-10-03 18:16:44.218881193 +0000 UTC m=+145.934822065" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.219696 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" podStartSLOduration=125.219691254 podStartE2EDuration="2m5.219691254s" podCreationTimestamp="2025-10-03 18:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:44.213447079 +0000 UTC m=+145.929387951" watchObservedRunningTime="2025-10-03 18:16:44.219691254 +0000 UTC m=+145.935632126" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.228177 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ndfwz"] Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.229225 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.231276 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.251770 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ndfwz"] Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.287104 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:44 crc kubenswrapper[4835]: E1003 18:16:44.287823 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:44.787797014 +0000 UTC m=+146.503737886 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.394265 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s2nx\" (UniqueName: \"kubernetes.io/projected/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-kube-api-access-4s2nx\") pod \"certified-operators-ndfwz\" (UID: \"fa6d5cf8-dfde-42f7-9507-48f41bf44b50\") " pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.394347 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-utilities\") pod \"certified-operators-ndfwz\" (UID: \"fa6d5cf8-dfde-42f7-9507-48f41bf44b50\") " pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.394392 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.394428 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-catalog-content\") pod \"certified-operators-ndfwz\" (UID: \"fa6d5cf8-dfde-42f7-9507-48f41bf44b50\") " pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:16:44 crc kubenswrapper[4835]: E1003 18:16:44.394886 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:44.894870544 +0000 UTC m=+146.610811416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.396603 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-l27kl" podStartSLOduration=126.396581889 podStartE2EDuration="2m6.396581889s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:44.265504185 +0000 UTC m=+145.981445057" watchObservedRunningTime="2025-10-03 18:16:44.396581889 +0000 UTC m=+146.112522761" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.399185 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xjj46"] Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.400145 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.431575 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.435916 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xjj46"] Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.495675 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:44 crc kubenswrapper[4835]: E1003 18:16:44.495809 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:44.99578043 +0000 UTC m=+146.711721302 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.495975 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s2nx\" (UniqueName: \"kubernetes.io/projected/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-kube-api-access-4s2nx\") pod \"certified-operators-ndfwz\" (UID: \"fa6d5cf8-dfde-42f7-9507-48f41bf44b50\") " pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.496027 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b246bf3-c3d8-41d9-9ae1-660fdc057961-utilities\") pod \"community-operators-xjj46\" (UID: \"9b246bf3-c3d8-41d9-9ae1-660fdc057961\") " pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.496053 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-utilities\") pod \"certified-operators-ndfwz\" (UID: \"fa6d5cf8-dfde-42f7-9507-48f41bf44b50\") " pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.496109 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b246bf3-c3d8-41d9-9ae1-660fdc057961-catalog-content\") pod \"community-operators-xjj46\" (UID: \"9b246bf3-c3d8-41d9-9ae1-660fdc057961\") " pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.496143 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.496177 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-catalog-content\") pod \"certified-operators-ndfwz\" (UID: \"fa6d5cf8-dfde-42f7-9507-48f41bf44b50\") " pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.496205 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t8sj\" (UniqueName: \"kubernetes.io/projected/9b246bf3-c3d8-41d9-9ae1-660fdc057961-kube-api-access-4t8sj\") pod \"community-operators-xjj46\" (UID: \"9b246bf3-c3d8-41d9-9ae1-660fdc057961\") " pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:16:44 crc kubenswrapper[4835]: E1003 18:16:44.496553 4835 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:44.99654274 +0000 UTC m=+146.712483602 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.497422 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-utilities\") pod \"certified-operators-ndfwz\" (UID: \"fa6d5cf8-dfde-42f7-9507-48f41bf44b50\") " pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.497441 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-catalog-content\") pod \"certified-operators-ndfwz\" (UID: \"fa6d5cf8-dfde-42f7-9507-48f41bf44b50\") " pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.550447 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s2nx\" (UniqueName: \"kubernetes.io/projected/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-kube-api-access-4s2nx\") pod \"certified-operators-ndfwz\" (UID: \"fa6d5cf8-dfde-42f7-9507-48f41bf44b50\") " pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.560802 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.580546 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jxp2s" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.600701 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.601128 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t8sj\" (UniqueName: \"kubernetes.io/projected/9b246bf3-c3d8-41d9-9ae1-660fdc057961-kube-api-access-4t8sj\") pod \"community-operators-xjj46\" (UID: \"9b246bf3-c3d8-41d9-9ae1-660fdc057961\") " pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.601214 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b246bf3-c3d8-41d9-9ae1-660fdc057961-utilities\") pod \"community-operators-xjj46\" (UID: \"9b246bf3-c3d8-41d9-9ae1-660fdc057961\") " pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.601246 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b246bf3-c3d8-41d9-9ae1-660fdc057961-catalog-content\") pod \"community-operators-xjj46\" (UID: \"9b246bf3-c3d8-41d9-9ae1-660fdc057961\") " pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:16:44 crc kubenswrapper[4835]: E1003 18:16:44.601313 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:45.101275868 +0000 UTC m=+146.817216740 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.602060 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b246bf3-c3d8-41d9-9ae1-660fdc057961-catalog-content\") pod \"community-operators-xjj46\" (UID: \"9b246bf3-c3d8-41d9-9ae1-660fdc057961\") " pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.602181 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b246bf3-c3d8-41d9-9ae1-660fdc057961-utilities\") pod \"community-operators-xjj46\" (UID: \"9b246bf3-c3d8-41d9-9ae1-660fdc057961\") " pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.627198 4835 patch_prober.go:28] interesting pod/router-default-5444994796-4fnq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 18:16:44 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 03 18:16:44 crc kubenswrapper[4835]: [+]process-running ok Oct 03 18:16:44 crc kubenswrapper[4835]: healthz check failed Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.627565 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4fnq6" podUID="a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.676005 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t8sj\" (UniqueName: \"kubernetes.io/projected/9b246bf3-c3d8-41d9-9ae1-660fdc057961-kube-api-access-4t8sj\") pod \"community-operators-xjj46\" (UID: \"9b246bf3-c3d8-41d9-9ae1-660fdc057961\") " pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.681217 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2dsfn"] Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.688109 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.699624 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2dsfn"] Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.707642 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n7lz\" (UniqueName: \"kubernetes.io/projected/84574502-e2af-4d0b-83e9-206513b44cbc-kube-api-access-2n7lz\") pod \"certified-operators-2dsfn\" (UID: \"84574502-e2af-4d0b-83e9-206513b44cbc\") " pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.707722 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84574502-e2af-4d0b-83e9-206513b44cbc-utilities\") pod \"certified-operators-2dsfn\" (UID: \"84574502-e2af-4d0b-83e9-206513b44cbc\") " pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.707745 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84574502-e2af-4d0b-83e9-206513b44cbc-catalog-content\") pod \"certified-operators-2dsfn\" (UID: \"84574502-e2af-4d0b-83e9-206513b44cbc\") " pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.707774 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:44 crc kubenswrapper[4835]: E1003 18:16:44.708048 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:45.208036489 +0000 UTC m=+146.923977361 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.725406 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.811278 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:44 crc kubenswrapper[4835]: E1003 18:16:44.811540 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:45.311524173 +0000 UTC m=+147.027465045 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.811587 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84574502-e2af-4d0b-83e9-206513b44cbc-catalog-content\") pod \"certified-operators-2dsfn\" (UID: \"84574502-e2af-4d0b-83e9-206513b44cbc\") " pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.811627 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.811677 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n7lz\" (UniqueName: \"kubernetes.io/projected/84574502-e2af-4d0b-83e9-206513b44cbc-kube-api-access-2n7lz\") pod \"certified-operators-2dsfn\" (UID: \"84574502-e2af-4d0b-83e9-206513b44cbc\") " pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.811733 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84574502-e2af-4d0b-83e9-206513b44cbc-utilities\") pod \"certified-operators-2dsfn\" (UID: \"84574502-e2af-4d0b-83e9-206513b44cbc\") " pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.812045 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84574502-e2af-4d0b-83e9-206513b44cbc-catalog-content\") pod \"certified-operators-2dsfn\" (UID: \"84574502-e2af-4d0b-83e9-206513b44cbc\") " pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.812114 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/84574502-e2af-4d0b-83e9-206513b44cbc-utilities\") pod \"certified-operators-2dsfn\" (UID: \"84574502-e2af-4d0b-83e9-206513b44cbc\") " pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.812287 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-frjq8"] Oct 03 18:16:44 crc kubenswrapper[4835]: E1003 18:16:44.812300 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:45.312288514 +0000 UTC m=+147.028229376 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.814011 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.837028 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-frjq8"] Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.849884 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n7lz\" (UniqueName: \"kubernetes.io/projected/84574502-e2af-4d0b-83e9-206513b44cbc-kube-api-access-2n7lz\") pod \"certified-operators-2dsfn\" (UID: \"84574502-e2af-4d0b-83e9-206513b44cbc\") " pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.862937 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-q6jkk" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.912987 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.913146 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb64305-2843-4f0d-89d9-15c6a2be9353-utilities\") pod \"community-operators-frjq8\" (UID: \"9fb64305-2843-4f0d-89d9-15c6a2be9353\") " pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.913187 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb64305-2843-4f0d-89d9-15c6a2be9353-catalog-content\") pod \"community-operators-frjq8\" (UID: \"9fb64305-2843-4f0d-89d9-15c6a2be9353\") " pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.913217 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-859kj\" (UniqueName: \"kubernetes.io/projected/9fb64305-2843-4f0d-89d9-15c6a2be9353-kube-api-access-859kj\") pod \"community-operators-frjq8\" (UID: \"9fb64305-2843-4f0d-89d9-15c6a2be9353\") " pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:16:44 crc kubenswrapper[4835]: E1003 18:16:44.913344 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:45.413329144 +0000 UTC m=+147.129270006 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.950956 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" event={"ID":"084d1738-c9cb-4d72-98e1-dbbd06e4b084","Type":"ContainerStarted","Data":"e73d23ad6f7ddfd4fbc9bf6b29997eb7c832627274ab1540914075994d94a8c0"} Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.951006 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" event={"ID":"084d1738-c9cb-4d72-98e1-dbbd06e4b084","Type":"ContainerStarted","Data":"50b35663cd29e77e3b028099e6e0925a646b330534cf465fac2f31ca2cbb897f"} Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.965841 4835 generic.go:334] "Generic (PLEG): container finished" podID="8cb17f76-674e-4cf7-8f87-9af6942bc5c3" containerID="943fb918381beb17520dd52bec8c1647113e869e66f2c357d45710ecbb9b300f" exitCode=0 Oct 03 18:16:44 crc kubenswrapper[4835]: I1003 18:16:44.967175 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" event={"ID":"8cb17f76-674e-4cf7-8f87-9af6942bc5c3","Type":"ContainerDied","Data":"943fb918381beb17520dd52bec8c1647113e869e66f2c357d45710ecbb9b300f"} Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.019110 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb64305-2843-4f0d-89d9-15c6a2be9353-utilities\") pod \"community-operators-frjq8\" (UID: \"9fb64305-2843-4f0d-89d9-15c6a2be9353\") " pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.019354 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.019388 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb64305-2843-4f0d-89d9-15c6a2be9353-catalog-content\") pod \"community-operators-frjq8\" (UID: \"9fb64305-2843-4f0d-89d9-15c6a2be9353\") " 
pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.019475 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-859kj\" (UniqueName: \"kubernetes.io/projected/9fb64305-2843-4f0d-89d9-15c6a2be9353-kube-api-access-859kj\") pod \"community-operators-frjq8\" (UID: \"9fb64305-2843-4f0d-89d9-15c6a2be9353\") " pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.022965 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb64305-2843-4f0d-89d9-15c6a2be9353-utilities\") pod \"community-operators-frjq8\" (UID: \"9fb64305-2843-4f0d-89d9-15c6a2be9353\") " pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:16:45 crc kubenswrapper[4835]: E1003 18:16:45.025314 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:45.525296823 +0000 UTC m=+147.241237695 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.034668 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb64305-2843-4f0d-89d9-15c6a2be9353-catalog-content\") pod \"community-operators-frjq8\" (UID: \"9fb64305-2843-4f0d-89d9-15c6a2be9353\") " pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.064408 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.066140 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-859kj\" (UniqueName: \"kubernetes.io/projected/9fb64305-2843-4f0d-89d9-15c6a2be9353-kube-api-access-859kj\") pod \"community-operators-frjq8\" (UID: \"9fb64305-2843-4f0d-89d9-15c6a2be9353\") " pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.122926 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:45 crc kubenswrapper[4835]: E1003 18:16:45.123151 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:45.623135598 +0000 UTC m=+147.339076470 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.123409 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:45 crc kubenswrapper[4835]: E1003 18:16:45.123675 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:45.623665992 +0000 UTC m=+147.339606864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.134342 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.226572 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:45 crc kubenswrapper[4835]: E1003 18:16:45.227149 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:45.727134486 +0000 UTC m=+147.443075358 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.308422 4835 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.336333 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:45 crc kubenswrapper[4835]: E1003 18:16:45.336681 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:45.83666974 +0000 UTC m=+147.552610612 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.437270 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:45 crc kubenswrapper[4835]: E1003 18:16:45.437698 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:45.937684429 +0000 UTC m=+147.653625301 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.536945 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ndfwz"] Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.538568 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:45 crc kubenswrapper[4835]: E1003 18:16:45.538956 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:46.038941026 +0000 UTC m=+147.754881898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.623406 4835 patch_prober.go:28] interesting pod/router-default-5444994796-4fnq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 18:16:45 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 03 18:16:45 crc kubenswrapper[4835]: [+]process-running ok Oct 03 18:16:45 crc kubenswrapper[4835]: healthz check failed Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.623460 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4fnq6" podUID="a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.641393 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:45 crc kubenswrapper[4835]: E1003 18:16:45.641893 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:46.141878495 +0000 UTC m=+147.857819357 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.680574 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2dsfn"] Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.682929 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-frjq8"] Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.706671 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xjj46"] Oct 03 18:16:45 crc kubenswrapper[4835]: W1003 18:16:45.737971 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b246bf3_c3d8_41d9_9ae1_660fdc057961.slice/crio-d27d82e101ea9483228f5b4e8e3cf6c6d7211cc16b52b1b77eef286f26c81b77 WatchSource:0}: Error finding container d27d82e101ea9483228f5b4e8e3cf6c6d7211cc16b52b1b77eef286f26c81b77: Status 404 returned error can't find the container with id d27d82e101ea9483228f5b4e8e3cf6c6d7211cc16b52b1b77eef286f26c81b77 Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.743878 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:45 crc kubenswrapper[4835]: E1003 18:16:45.744218 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:46.244184089 +0000 UTC m=+147.960124961 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.845396 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:45 crc kubenswrapper[4835]: E1003 18:16:45.845508 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 18:16:46.345483436 +0000 UTC m=+148.061424308 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.845938 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.846133 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.846276 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.846374 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.846469 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:45 crc kubenswrapper[4835]: E1003 18:16:45.846851 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:46.346820651 +0000 UTC m=+148.062761523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.847708 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.862944 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.863536 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.868239 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.887377 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.949534 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:45 crc kubenswrapper[4835]: E1003 18:16:45.949877 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 18:16:46.449862484 +0000 UTC m=+148.165803356 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.988236 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dsfn" event={"ID":"84574502-e2af-4d0b-83e9-206513b44cbc","Type":"ContainerStarted","Data":"ba4a3876aa2746b79f30fd4b4a95404ae47827f9a378f431fa6e66d3e945817f"} Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.991537 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frjq8" event={"ID":"9fb64305-2843-4f0d-89d9-15c6a2be9353","Type":"ContainerStarted","Data":"cdf2437c402de25c396f95f6378e8257b6d7e64b9e3358a0be17ca7699abc99a"} Oct 03 18:16:45 crc kubenswrapper[4835]: I1003 18:16:45.991566 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frjq8" event={"ID":"9fb64305-2843-4f0d-89d9-15c6a2be9353","Type":"ContainerStarted","Data":"de61510676af3421c2ff1b69ac28af89e98b47ad0497553f6ad3832461dc60df"} Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.021620 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjj46" event={"ID":"9b246bf3-c3d8-41d9-9ae1-660fdc057961","Type":"ContainerStarted","Data":"40ca00b28223b3725170500ec9be054191e082f53f3c428d156bf5efc391899a"} Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.021663 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjj46" event={"ID":"9b246bf3-c3d8-41d9-9ae1-660fdc057961","Type":"ContainerStarted","Data":"d27d82e101ea9483228f5b4e8e3cf6c6d7211cc16b52b1b77eef286f26c81b77"} Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.026650 4835 generic.go:334] "Generic (PLEG): container finished" podID="fa6d5cf8-dfde-42f7-9507-48f41bf44b50" containerID="fb06301098b71f3f3497d75c607b9be277f6e757c92aec7536c9db6272a24ac0" exitCode=0 Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.026911 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ndfwz" event={"ID":"fa6d5cf8-dfde-42f7-9507-48f41bf44b50","Type":"ContainerDied","Data":"fb06301098b71f3f3497d75c607b9be277f6e757c92aec7536c9db6272a24ac0"} Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.027129 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ndfwz" event={"ID":"fa6d5cf8-dfde-42f7-9507-48f41bf44b50","Type":"ContainerStarted","Data":"eb0fcc4293f2211d002cb1afd001adfece9e9293b30d92a5f32ae3c84ed5ea64"} Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.028305 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.029847 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" event={"ID":"084d1738-c9cb-4d72-98e1-dbbd06e4b084","Type":"ContainerStarted","Data":"809b56204a641442a5ba0c51bf012ad9a99a0ecb0976e5d8716c5d234dac451d"} Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.050855 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:46 crc kubenswrapper[4835]: E1003 18:16:46.051640 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 18:16:46.551619483 +0000 UTC m=+148.267560355 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqldm" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.065143 4835 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-03T18:16:45.308448075Z","Handler":null,"Name":""} Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.089219 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.097002 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.108295 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-bv6gz" podStartSLOduration=10.10827706 podStartE2EDuration="10.10827706s" podCreationTimestamp="2025-10-03 18:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:46.106058812 +0000 UTC m=+147.821999684" watchObservedRunningTime="2025-10-03 18:16:46.10827706 +0000 UTC m=+147.824217922" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.129862 4835 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.129893 4835 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.152169 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.158985 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.185510 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.186661 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.193192 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.193403 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.195329 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.256554 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01e04886-c94d-4494-be1f-9a2aa17adf40-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"01e04886-c94d-4494-be1f-9a2aa17adf40\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.256647 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.256675 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01e04886-c94d-4494-be1f-9a2aa17adf40-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"01e04886-c94d-4494-be1f-9a2aa17adf40\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.274231 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.349441 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
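
[Editorial aside] The long run of "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers" retries above clears at exactly this point: the hostpath plugin has just announced itself on the kubelet's plugin-registration socket (plugin_watcher "Adding socket path ..." for /var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock, then csi_plugin.go "Trying to validate a new CSI Driver ... versions: 1.0.0" and "Register new plugin"), so the attacher and unmounter can finally resolve a CSI client; MountDevice itself is skipped because the driver does not advertise the STAGE_UNSTAGE_VOLUME capability. Below is a minimal, illustrative Go sketch of that registration handshake — the part normally performed by a node-driver-registrar-style sidecar, not the hostpath provisioner's actual code — assuming the k8s.io/kubelet pluginregistration/v1 API; the driver name, socket paths, and version string are taken from the log lines above, everything else is a hedged example.

```go
// Sketch: how a CSI node plugin registers with the kubelet's plugin watcher.
// The kubelet watches /var/lib/kubelet/plugins_registry/, dials any *.sock it
// finds, calls GetInfo, and only then adds the driver to its registered list --
// which is what ends the "driver name ... not found" retries in this log.
package main

import (
	"context"
	"log"
	"net"
	"os"

	"google.golang.org/grpc"
	registerapi "k8s.io/kubelet/pkg/apis/pluginregistration/v1"
)

type registrationServer struct {
	driverName  string   // CSI driver name reported to the kubelet
	csiEndpoint string   // path to the driver's own CSI gRPC socket
	versions    []string // supported CSI versions ("1.0.0" in the log above)
}

// GetInfo is the kubelet's first call on the registration socket; the reply
// tells it this is a CSI plugin, what it is called, and where its CSI
// endpoint lives.
func (s *registrationServer) GetInfo(ctx context.Context, req *registerapi.InfoRequest) (*registerapi.PluginInfo, error) {
	return &registerapi.PluginInfo{
		Type:              registerapi.CSIPlugin,
		Name:              s.driverName,
		Endpoint:          s.csiEndpoint,
		SupportedVersions: s.versions,
	}, nil
}

// NotifyRegistrationStatus is the kubelet's acknowledgement: either the driver
// is now registered, or registration failed with an error string.
func (s *registrationServer) NotifyRegistrationStatus(ctx context.Context, status *registerapi.RegistrationStatus) (*registerapi.RegistrationStatusResponse, error) {
	if !status.PluginRegistered {
		log.Printf("registration rejected by kubelet: %s", status.Error)
	}
	return &registerapi.RegistrationStatusResponse{}, nil
}

func main() {
	// Socket path and names as they appear in the log; in a real deployment a
	// node-driver-registrar sidecar manages this socket.
	regSock := "/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
	os.Remove(regSock) // drop a stale socket left by a previous run, if any

	lis, err := net.Listen("unix", regSock)
	if err != nil {
		log.Fatalf("listen on registration socket: %v", err)
	}

	srv := grpc.NewServer()
	registerapi.RegisterRegistrationServer(srv, &registrationServer{
		driverName:  "kubevirt.io.hostpath-provisioner",
		csiEndpoint: "/var/lib/kubelet/plugins/csi-hostpath/csi.sock",
		versions:    []string{"1.0.0"},
	})
	log.Fatal(srv.Serve(lis))
}
```

Once GetInfo returns, the kubelet validates the driver at the reported endpoint and adds it to its registered-driver list, which is why the pending MountDevice and UnmountVolume.TearDown retries for pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 succeed in the entries that follow.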
Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.349510 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.357797 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98b9r\" (UniqueName: \"kubernetes.io/projected/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-kube-api-access-98b9r\") pod \"8cb17f76-674e-4cf7-8f87-9af6942bc5c3\" (UID: \"8cb17f76-674e-4cf7-8f87-9af6942bc5c3\") " Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.357895 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-secret-volume\") pod \"8cb17f76-674e-4cf7-8f87-9af6942bc5c3\" (UID: \"8cb17f76-674e-4cf7-8f87-9af6942bc5c3\") " Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.357938 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-config-volume\") pod \"8cb17f76-674e-4cf7-8f87-9af6942bc5c3\" (UID: \"8cb17f76-674e-4cf7-8f87-9af6942bc5c3\") " Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.358241 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01e04886-c94d-4494-be1f-9a2aa17adf40-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"01e04886-c94d-4494-be1f-9a2aa17adf40\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.358319 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01e04886-c94d-4494-be1f-9a2aa17adf40-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"01e04886-c94d-4494-be1f-9a2aa17adf40\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.359868 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-config-volume" (OuterVolumeSpecName: "config-volume") pod "8cb17f76-674e-4cf7-8f87-9af6942bc5c3" (UID: "8cb17f76-674e-4cf7-8f87-9af6942bc5c3"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.359922 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01e04886-c94d-4494-be1f-9a2aa17adf40-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"01e04886-c94d-4494-be1f-9a2aa17adf40\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.364514 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-kube-api-access-98b9r" (OuterVolumeSpecName: "kube-api-access-98b9r") pod "8cb17f76-674e-4cf7-8f87-9af6942bc5c3" (UID: "8cb17f76-674e-4cf7-8f87-9af6942bc5c3"). InnerVolumeSpecName "kube-api-access-98b9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.368393 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8cb17f76-674e-4cf7-8f87-9af6942bc5c3" (UID: "8cb17f76-674e-4cf7-8f87-9af6942bc5c3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.378689 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01e04886-c94d-4494-be1f-9a2aa17adf40-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"01e04886-c94d-4494-be1f-9a2aa17adf40\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.385835 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqldm\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.386425 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zrf6l"] Oct 03 18:16:46 crc kubenswrapper[4835]: E1003 18:16:46.388544 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb17f76-674e-4cf7-8f87-9af6942bc5c3" containerName="collect-profiles" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.388565 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb17f76-674e-4cf7-8f87-9af6942bc5c3" containerName="collect-profiles" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.388671 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb17f76-674e-4cf7-8f87-9af6942bc5c3" containerName="collect-profiles" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.389378 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.396611 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrf6l"] Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.397183 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.459352 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6d7146-dd06-4086-8b38-2140c5deeff9-utilities\") pod \"redhat-marketplace-zrf6l\" (UID: \"ed6d7146-dd06-4086-8b38-2140c5deeff9\") " pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.459495 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6d7146-dd06-4086-8b38-2140c5deeff9-catalog-content\") pod \"redhat-marketplace-zrf6l\" (UID: \"ed6d7146-dd06-4086-8b38-2140c5deeff9\") " pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.459598 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mhtv\" (UniqueName: \"kubernetes.io/projected/ed6d7146-dd06-4086-8b38-2140c5deeff9-kube-api-access-8mhtv\") pod \"redhat-marketplace-zrf6l\" (UID: \"ed6d7146-dd06-4086-8b38-2140c5deeff9\") " pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.459662 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98b9r\" (UniqueName: \"kubernetes.io/projected/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-kube-api-access-98b9r\") on node \"crc\" DevicePath \"\"" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.459678 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.459770 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8cb17f76-674e-4cf7-8f87-9af6942bc5c3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 18:16:46 crc kubenswrapper[4835]: W1003 18:16:46.488440 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-b38e4663d9b6543e07f90f85260e91fcb846d9233bbd44ff7be261ef9ebca765 WatchSource:0}: Error finding container b38e4663d9b6543e07f90f85260e91fcb846d9233bbd44ff7be261ef9ebca765: Status 404 returned error can't find the container with id b38e4663d9b6543e07f90f85260e91fcb846d9233bbd44ff7be261ef9ebca765 Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.490996 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.518866 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.561013 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mhtv\" (UniqueName: \"kubernetes.io/projected/ed6d7146-dd06-4086-8b38-2140c5deeff9-kube-api-access-8mhtv\") pod \"redhat-marketplace-zrf6l\" (UID: \"ed6d7146-dd06-4086-8b38-2140c5deeff9\") " pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.561055 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6d7146-dd06-4086-8b38-2140c5deeff9-utilities\") pod \"redhat-marketplace-zrf6l\" (UID: \"ed6d7146-dd06-4086-8b38-2140c5deeff9\") " pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.561106 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6d7146-dd06-4086-8b38-2140c5deeff9-catalog-content\") pod \"redhat-marketplace-zrf6l\" (UID: \"ed6d7146-dd06-4086-8b38-2140c5deeff9\") " pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.561548 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6d7146-dd06-4086-8b38-2140c5deeff9-catalog-content\") pod \"redhat-marketplace-zrf6l\" (UID: \"ed6d7146-dd06-4086-8b38-2140c5deeff9\") " pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.564564 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6d7146-dd06-4086-8b38-2140c5deeff9-utilities\") pod \"redhat-marketplace-zrf6l\" (UID: \"ed6d7146-dd06-4086-8b38-2140c5deeff9\") " pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.581950 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mhtv\" (UniqueName: \"kubernetes.io/projected/ed6d7146-dd06-4086-8b38-2140c5deeff9-kube-api-access-8mhtv\") pod \"redhat-marketplace-zrf6l\" (UID: \"ed6d7146-dd06-4086-8b38-2140c5deeff9\") " pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.620660 4835 patch_prober.go:28] interesting pod/router-default-5444994796-4fnq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 18:16:46 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 03 18:16:46 crc kubenswrapper[4835]: [+]process-running ok Oct 03 18:16:46 crc kubenswrapper[4835]: healthz check failed Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.620720 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4fnq6" podUID="a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 18:16:46 crc kubenswrapper[4835]: W1003 18:16:46.631594 4835 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-6c01158916cee178539d5152b913349675e568b33e7a6debf33bdc28071526fd WatchSource:0}: Error finding container 6c01158916cee178539d5152b913349675e568b33e7a6debf33bdc28071526fd: Status 404 returned error can't find the container with id 6c01158916cee178539d5152b913349675e568b33e7a6debf33bdc28071526fd Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.720020 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.727301 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqldm"] Oct 03 18:16:46 crc kubenswrapper[4835]: W1003 18:16:46.739562 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13a0a3b4_f158_42a9_bfbb_2776aa6efe75.slice/crio-ee441ca13503c8c8d641bfcc8f526c2995f788925812a967c4e87f0ca06fc8bc WatchSource:0}: Error finding container ee441ca13503c8c8d641bfcc8f526c2995f788925812a967c4e87f0ca06fc8bc: Status 404 returned error can't find the container with id ee441ca13503c8c8d641bfcc8f526c2995f788925812a967c4e87f0ca06fc8bc Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.764877 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.781009 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xjmsb"] Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.782203 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.798130 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjmsb"] Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.864414 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrnws\" (UniqueName: \"kubernetes.io/projected/64185dc5-d857-4370-8a3b-f1e5ca460bc0-kube-api-access-qrnws\") pod \"redhat-marketplace-xjmsb\" (UID: \"64185dc5-d857-4370-8a3b-f1e5ca460bc0\") " pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.864672 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64185dc5-d857-4370-8a3b-f1e5ca460bc0-catalog-content\") pod \"redhat-marketplace-xjmsb\" (UID: \"64185dc5-d857-4370-8a3b-f1e5ca460bc0\") " pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.864742 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64185dc5-d857-4370-8a3b-f1e5ca460bc0-utilities\") pod \"redhat-marketplace-xjmsb\" (UID: \"64185dc5-d857-4370-8a3b-f1e5ca460bc0\") " pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.897280 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.966516 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64185dc5-d857-4370-8a3b-f1e5ca460bc0-utilities\") pod \"redhat-marketplace-xjmsb\" (UID: \"64185dc5-d857-4370-8a3b-f1e5ca460bc0\") " pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.966592 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrnws\" (UniqueName: \"kubernetes.io/projected/64185dc5-d857-4370-8a3b-f1e5ca460bc0-kube-api-access-qrnws\") pod \"redhat-marketplace-xjmsb\" (UID: \"64185dc5-d857-4370-8a3b-f1e5ca460bc0\") " pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.966620 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64185dc5-d857-4370-8a3b-f1e5ca460bc0-catalog-content\") pod \"redhat-marketplace-xjmsb\" (UID: \"64185dc5-d857-4370-8a3b-f1e5ca460bc0\") " pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.967447 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64185dc5-d857-4370-8a3b-f1e5ca460bc0-utilities\") pod \"redhat-marketplace-xjmsb\" (UID: \"64185dc5-d857-4370-8a3b-f1e5ca460bc0\") " pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.967517 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64185dc5-d857-4370-8a3b-f1e5ca460bc0-catalog-content\") 
pod \"redhat-marketplace-xjmsb\" (UID: \"64185dc5-d857-4370-8a3b-f1e5ca460bc0\") " pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.981598 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrf6l"] Oct 03 18:16:46 crc kubenswrapper[4835]: I1003 18:16:46.987416 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrnws\" (UniqueName: \"kubernetes.io/projected/64185dc5-d857-4370-8a3b-f1e5ca460bc0-kube-api-access-qrnws\") pod \"redhat-marketplace-xjmsb\" (UID: \"64185dc5-d857-4370-8a3b-f1e5ca460bc0\") " pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.046295 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"01e04886-c94d-4494-be1f-9a2aa17adf40","Type":"ContainerStarted","Data":"c103d1505cba8a545ecda33cf43179d7f39787be34f72e1a5faef279e14042c4"} Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.060870 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5b8a943abaa670cec9c71add5518930a066e244f46834719fcad6269d78fad2f"} Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.060919 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b38e4663d9b6543e07f90f85260e91fcb846d9233bbd44ff7be261ef9ebca765"} Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.063767 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ccbfc8cb4e31386d774697fd3035bbfef713cc45a0672d80a6408c0222de1834"} Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.063871 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"925b0f7702a3e5c438f422791f0ba5b9529ed88cddeb1cbec9d88efe42da38ba"} Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.069106 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" event={"ID":"13a0a3b4-f158-42a9-bfbb-2776aa6efe75","Type":"ContainerStarted","Data":"88ffba0983de35eb7dac69554c11c6d6eb33b5be9fb755e071662a50665c0e99"} Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.069226 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" event={"ID":"13a0a3b4-f158-42a9-bfbb-2776aa6efe75","Type":"ContainerStarted","Data":"ee441ca13503c8c8d641bfcc8f526c2995f788925812a967c4e87f0ca06fc8bc"} Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.069662 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.071912 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrf6l" 
event={"ID":"ed6d7146-dd06-4086-8b38-2140c5deeff9","Type":"ContainerStarted","Data":"cc95ab232307ac0e8abcf4ad87b4acabacf72c19f355c15b755b54096864984a"} Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.096001 4835 generic.go:334] "Generic (PLEG): container finished" podID="84574502-e2af-4d0b-83e9-206513b44cbc" containerID="d99274bb05f2541fdd40265c715397c0f020a5e7101e72bfbfdb058862383c23" exitCode=0 Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.096098 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dsfn" event={"ID":"84574502-e2af-4d0b-83e9-206513b44cbc","Type":"ContainerDied","Data":"d99274bb05f2541fdd40265c715397c0f020a5e7101e72bfbfdb058862383c23"} Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.113870 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" podStartSLOduration=129.113851451 podStartE2EDuration="2m9.113851451s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:16:47.113682437 +0000 UTC m=+148.829623309" watchObservedRunningTime="2025-10-03 18:16:47.113851451 +0000 UTC m=+148.829792323" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.121962 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.122187 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2" event={"ID":"8cb17f76-674e-4cf7-8f87-9af6942bc5c3","Type":"ContainerDied","Data":"8fbe358286054a65ed84e52578e85518c8a89fe8ed68f095049ee56c03c4cc95"} Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.122225 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fbe358286054a65ed84e52578e85518c8a89fe8ed68f095049ee56c03c4cc95" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.128950 4835 generic.go:334] "Generic (PLEG): container finished" podID="9fb64305-2843-4f0d-89d9-15c6a2be9353" containerID="cdf2437c402de25c396f95f6378e8257b6d7e64b9e3358a0be17ca7699abc99a" exitCode=0 Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.129015 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frjq8" event={"ID":"9fb64305-2843-4f0d-89d9-15c6a2be9353","Type":"ContainerDied","Data":"cdf2437c402de25c396f95f6378e8257b6d7e64b9e3358a0be17ca7699abc99a"} Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.142720 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.152257 4835 generic.go:334] "Generic (PLEG): container finished" podID="9b246bf3-c3d8-41d9-9ae1-660fdc057961" containerID="40ca00b28223b3725170500ec9be054191e082f53f3c428d156bf5efc391899a" exitCode=0 Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.152338 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjj46" event={"ID":"9b246bf3-c3d8-41d9-9ae1-660fdc057961","Type":"ContainerDied","Data":"40ca00b28223b3725170500ec9be054191e082f53f3c428d156bf5efc391899a"} Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.157925 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f97dd0911a78e1d86aac2191b01f385052ff99708bf6664b7aab2f825428cb8e"} Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.157964 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6c01158916cee178539d5152b913349675e568b33e7a6debf33bdc28071526fd"} Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.158300 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.381186 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tr45z"] Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.384732 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.386699 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.397857 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tr45z"] Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.473835 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fc82477-8141-4654-9153-b2a046309e8b-catalog-content\") pod \"redhat-operators-tr45z\" (UID: \"6fc82477-8141-4654-9153-b2a046309e8b\") " pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.474199 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g9nw\" (UniqueName: \"kubernetes.io/projected/6fc82477-8141-4654-9153-b2a046309e8b-kube-api-access-8g9nw\") pod \"redhat-operators-tr45z\" (UID: \"6fc82477-8141-4654-9153-b2a046309e8b\") " pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.474259 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fc82477-8141-4654-9153-b2a046309e8b-utilities\") pod \"redhat-operators-tr45z\" (UID: \"6fc82477-8141-4654-9153-b2a046309e8b\") " pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.575481 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fc82477-8141-4654-9153-b2a046309e8b-catalog-content\") pod \"redhat-operators-tr45z\" (UID: \"6fc82477-8141-4654-9153-b2a046309e8b\") " pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.575568 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g9nw\" (UniqueName: \"kubernetes.io/projected/6fc82477-8141-4654-9153-b2a046309e8b-kube-api-access-8g9nw\") pod \"redhat-operators-tr45z\" (UID: \"6fc82477-8141-4654-9153-b2a046309e8b\") " pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.575642 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fc82477-8141-4654-9153-b2a046309e8b-utilities\") pod \"redhat-operators-tr45z\" (UID: \"6fc82477-8141-4654-9153-b2a046309e8b\") " pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.576353 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fc82477-8141-4654-9153-b2a046309e8b-catalog-content\") pod \"redhat-operators-tr45z\" (UID: \"6fc82477-8141-4654-9153-b2a046309e8b\") " pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.576433 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fc82477-8141-4654-9153-b2a046309e8b-utilities\") pod \"redhat-operators-tr45z\" (UID: \"6fc82477-8141-4654-9153-b2a046309e8b\") " 
pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.611961 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g9nw\" (UniqueName: \"kubernetes.io/projected/6fc82477-8141-4654-9153-b2a046309e8b-kube-api-access-8g9nw\") pod \"redhat-operators-tr45z\" (UID: \"6fc82477-8141-4654-9153-b2a046309e8b\") " pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.621256 4835 patch_prober.go:28] interesting pod/router-default-5444994796-4fnq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 18:16:47 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 03 18:16:47 crc kubenswrapper[4835]: [+]process-running ok Oct 03 18:16:47 crc kubenswrapper[4835]: healthz check failed Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.621325 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4fnq6" podUID="a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.681422 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjmsb"] Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.703425 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:16:47 crc kubenswrapper[4835]: W1003 18:16:47.705244 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64185dc5_d857_4370_8a3b_f1e5ca460bc0.slice/crio-d0031570191d37210a919a84ba898b8197c3acfc0b1c873d0787eda805c12b21 WatchSource:0}: Error finding container d0031570191d37210a919a84ba898b8197c3acfc0b1c873d0787eda805c12b21: Status 404 returned error can't find the container with id d0031570191d37210a919a84ba898b8197c3acfc0b1c873d0787eda805c12b21 Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.781880 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n8brz"] Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.782802 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.796535 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8brz"] Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.880715 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/801fdb9d-0303-46b9-aa79-8c30cda070eb-catalog-content\") pod \"redhat-operators-n8brz\" (UID: \"801fdb9d-0303-46b9-aa79-8c30cda070eb\") " pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.880819 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjc7g\" (UniqueName: \"kubernetes.io/projected/801fdb9d-0303-46b9-aa79-8c30cda070eb-kube-api-access-tjc7g\") pod \"redhat-operators-n8brz\" (UID: \"801fdb9d-0303-46b9-aa79-8c30cda070eb\") " pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.880861 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/801fdb9d-0303-46b9-aa79-8c30cda070eb-utilities\") pod \"redhat-operators-n8brz\" (UID: \"801fdb9d-0303-46b9-aa79-8c30cda070eb\") " pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.982086 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjc7g\" (UniqueName: \"kubernetes.io/projected/801fdb9d-0303-46b9-aa79-8c30cda070eb-kube-api-access-tjc7g\") pod \"redhat-operators-n8brz\" (UID: \"801fdb9d-0303-46b9-aa79-8c30cda070eb\") " pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.982166 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/801fdb9d-0303-46b9-aa79-8c30cda070eb-utilities\") pod \"redhat-operators-n8brz\" (UID: \"801fdb9d-0303-46b9-aa79-8c30cda070eb\") " pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.982227 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/801fdb9d-0303-46b9-aa79-8c30cda070eb-catalog-content\") pod \"redhat-operators-n8brz\" (UID: \"801fdb9d-0303-46b9-aa79-8c30cda070eb\") " pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.982804 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/801fdb9d-0303-46b9-aa79-8c30cda070eb-utilities\") pod \"redhat-operators-n8brz\" (UID: \"801fdb9d-0303-46b9-aa79-8c30cda070eb\") " pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.984216 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/801fdb9d-0303-46b9-aa79-8c30cda070eb-catalog-content\") pod \"redhat-operators-n8brz\" (UID: \"801fdb9d-0303-46b9-aa79-8c30cda070eb\") " pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:16:47 crc kubenswrapper[4835]: I1003 18:16:47.998939 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tjc7g\" (UniqueName: \"kubernetes.io/projected/801fdb9d-0303-46b9-aa79-8c30cda070eb-kube-api-access-tjc7g\") pod \"redhat-operators-n8brz\" (UID: \"801fdb9d-0303-46b9-aa79-8c30cda070eb\") " pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:16:48 crc kubenswrapper[4835]: I1003 18:16:48.122997 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:16:48 crc kubenswrapper[4835]: I1003 18:16:48.136189 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tr45z"] Oct 03 18:16:48 crc kubenswrapper[4835]: I1003 18:16:48.171879 4835 generic.go:334] "Generic (PLEG): container finished" podID="01e04886-c94d-4494-be1f-9a2aa17adf40" containerID="cd5cd55c23aa0f5193ee0d953b0492a0aacdfeaf8bb034bedf0279e146fb2c40" exitCode=0 Oct 03 18:16:48 crc kubenswrapper[4835]: I1003 18:16:48.171965 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"01e04886-c94d-4494-be1f-9a2aa17adf40","Type":"ContainerDied","Data":"cd5cd55c23aa0f5193ee0d953b0492a0aacdfeaf8bb034bedf0279e146fb2c40"} Oct 03 18:16:48 crc kubenswrapper[4835]: I1003 18:16:48.174399 4835 generic.go:334] "Generic (PLEG): container finished" podID="ed6d7146-dd06-4086-8b38-2140c5deeff9" containerID="d544a375ca5127e76b8115106ca6a918427049fb18bc9690957b666da42f7bbc" exitCode=0 Oct 03 18:16:48 crc kubenswrapper[4835]: I1003 18:16:48.174468 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrf6l" event={"ID":"ed6d7146-dd06-4086-8b38-2140c5deeff9","Type":"ContainerDied","Data":"d544a375ca5127e76b8115106ca6a918427049fb18bc9690957b666da42f7bbc"} Oct 03 18:16:48 crc kubenswrapper[4835]: I1003 18:16:48.181674 4835 generic.go:334] "Generic (PLEG): container finished" podID="64185dc5-d857-4370-8a3b-f1e5ca460bc0" containerID="15533908bc132e910a993ef83532e71105b08548c9cfd28c4fa5559b359e819c" exitCode=0 Oct 03 18:16:48 crc kubenswrapper[4835]: I1003 18:16:48.182351 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjmsb" event={"ID":"64185dc5-d857-4370-8a3b-f1e5ca460bc0","Type":"ContainerDied","Data":"15533908bc132e910a993ef83532e71105b08548c9cfd28c4fa5559b359e819c"} Oct 03 18:16:48 crc kubenswrapper[4835]: I1003 18:16:48.182383 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjmsb" event={"ID":"64185dc5-d857-4370-8a3b-f1e5ca460bc0","Type":"ContainerStarted","Data":"d0031570191d37210a919a84ba898b8197c3acfc0b1c873d0787eda805c12b21"} Oct 03 18:16:48 crc kubenswrapper[4835]: I1003 18:16:48.386933 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8brz"] Oct 03 18:16:48 crc kubenswrapper[4835]: W1003 18:16:48.432091 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod801fdb9d_0303_46b9_aa79_8c30cda070eb.slice/crio-76b5bc562aac1b0caf783a70b9a3a091e993072ec0c4a29f20ad30eeab912a26 WatchSource:0}: Error finding container 76b5bc562aac1b0caf783a70b9a3a091e993072ec0c4a29f20ad30eeab912a26: Status 404 returned error can't find the container with id 76b5bc562aac1b0caf783a70b9a3a091e993072ec0c4a29f20ad30eeab912a26 Oct 03 18:16:48 crc kubenswrapper[4835]: I1003 18:16:48.620700 4835 patch_prober.go:28] interesting pod/router-default-5444994796-4fnq6 container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 18:16:48 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 03 18:16:48 crc kubenswrapper[4835]: [+]process-running ok Oct 03 18:16:48 crc kubenswrapper[4835]: healthz check failed Oct 03 18:16:48 crc kubenswrapper[4835]: I1003 18:16:48.620988 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4fnq6" podUID="a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 18:16:48 crc kubenswrapper[4835]: I1003 18:16:48.757278 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:48 crc kubenswrapper[4835]: I1003 18:16:48.757333 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:48 crc kubenswrapper[4835]: I1003 18:16:48.769451 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.043853 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.043914 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.046288 4835 patch_prober.go:28] interesting pod/console-f9d7485db-8nwfg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.046347 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8nwfg" podUID="300d2397-b9b1-4f44-9eb2-5757940cc64c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.067518 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-mcjr4 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.067559 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mcjr4" podUID="efa22703-0a66-407f-9f2e-333bac190ce8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.067568 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-mcjr4 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.067611 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mcjr4" podUID="efa22703-0a66-407f-9f2e-333bac190ce8" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.080579 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.085555 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-n8dm7" Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.235413 4835 generic.go:334] "Generic (PLEG): container finished" podID="801fdb9d-0303-46b9-aa79-8c30cda070eb" containerID="a5114f0d0fa2cc8cc37df6098e64b8553d7a7b3e4891cc210d3396a3d0050377" exitCode=0 Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.235762 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8brz" event={"ID":"801fdb9d-0303-46b9-aa79-8c30cda070eb","Type":"ContainerDied","Data":"a5114f0d0fa2cc8cc37df6098e64b8553d7a7b3e4891cc210d3396a3d0050377"} Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.235835 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8brz" event={"ID":"801fdb9d-0303-46b9-aa79-8c30cda070eb","Type":"ContainerStarted","Data":"76b5bc562aac1b0caf783a70b9a3a091e993072ec0c4a29f20ad30eeab912a26"} Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.263105 4835 generic.go:334] "Generic (PLEG): container finished" podID="6fc82477-8141-4654-9153-b2a046309e8b" containerID="47366d00d368cbcbdf0ffa673294f1d624d4c08c5df95455638b43ed42dfcfd9" exitCode=0 Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.263997 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr45z" event={"ID":"6fc82477-8141-4654-9153-b2a046309e8b","Type":"ContainerDied","Data":"47366d00d368cbcbdf0ffa673294f1d624d4c08c5df95455638b43ed42dfcfd9"} Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.264028 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr45z" event={"ID":"6fc82477-8141-4654-9153-b2a046309e8b","Type":"ContainerStarted","Data":"dea543a28a44983a3c697558ec2c5d0030dfa15c8bf0857b0b3d8b9ac8140a0d"} Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.269918 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-nh4lt" Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.617161 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.621772 4835 patch_prober.go:28] interesting pod/router-default-5444994796-4fnq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 18:16:49 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 03 18:16:49 crc kubenswrapper[4835]: [+]process-running ok Oct 03 18:16:49 crc kubenswrapper[4835]: healthz check failed Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.621824 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4fnq6" podUID="a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 
18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.637427 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.639584 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.738050 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01e04886-c94d-4494-be1f-9a2aa17adf40-kube-api-access\") pod \"01e04886-c94d-4494-be1f-9a2aa17adf40\" (UID: \"01e04886-c94d-4494-be1f-9a2aa17adf40\") " Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.738202 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01e04886-c94d-4494-be1f-9a2aa17adf40-kubelet-dir\") pod \"01e04886-c94d-4494-be1f-9a2aa17adf40\" (UID: \"01e04886-c94d-4494-be1f-9a2aa17adf40\") " Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.738355 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01e04886-c94d-4494-be1f-9a2aa17adf40-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "01e04886-c94d-4494-be1f-9a2aa17adf40" (UID: "01e04886-c94d-4494-be1f-9a2aa17adf40"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.738642 4835 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01e04886-c94d-4494-be1f-9a2aa17adf40-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.743304 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e04886-c94d-4494-be1f-9a2aa17adf40-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "01e04886-c94d-4494-be1f-9a2aa17adf40" (UID: "01e04886-c94d-4494-be1f-9a2aa17adf40"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:16:49 crc kubenswrapper[4835]: I1003 18:16:49.839443 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01e04886-c94d-4494-be1f-9a2aa17adf40-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 18:16:50 crc kubenswrapper[4835]: I1003 18:16:50.276639 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"01e04886-c94d-4494-be1f-9a2aa17adf40","Type":"ContainerDied","Data":"c103d1505cba8a545ecda33cf43179d7f39787be34f72e1a5faef279e14042c4"} Oct 03 18:16:50 crc kubenswrapper[4835]: I1003 18:16:50.276692 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c103d1505cba8a545ecda33cf43179d7f39787be34f72e1a5faef279e14042c4" Oct 03 18:16:50 crc kubenswrapper[4835]: I1003 18:16:50.276763 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 18:16:50 crc kubenswrapper[4835]: I1003 18:16:50.620853 4835 patch_prober.go:28] interesting pod/router-default-5444994796-4fnq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 18:16:50 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 03 18:16:50 crc kubenswrapper[4835]: [+]process-running ok Oct 03 18:16:50 crc kubenswrapper[4835]: healthz check failed Oct 03 18:16:50 crc kubenswrapper[4835]: I1003 18:16:50.620913 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4fnq6" podUID="a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 18:16:51 crc kubenswrapper[4835]: I1003 18:16:51.620233 4835 patch_prober.go:28] interesting pod/router-default-5444994796-4fnq6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 18:16:51 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Oct 03 18:16:51 crc kubenswrapper[4835]: [+]process-running ok Oct 03 18:16:51 crc kubenswrapper[4835]: healthz check failed Oct 03 18:16:51 crc kubenswrapper[4835]: I1003 18:16:51.620295 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-4fnq6" podUID="a5395bc7-4ea8-4863-95d5-cd8ef1cc8b85" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 18:16:52 crc kubenswrapper[4835]: I1003 18:16:52.054615 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 18:16:52 crc kubenswrapper[4835]: E1003 18:16:52.055109 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e04886-c94d-4494-be1f-9a2aa17adf40" containerName="pruner" Oct 03 18:16:52 crc kubenswrapper[4835]: I1003 18:16:52.055121 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e04886-c94d-4494-be1f-9a2aa17adf40" containerName="pruner" Oct 03 18:16:52 crc kubenswrapper[4835]: I1003 18:16:52.055224 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e04886-c94d-4494-be1f-9a2aa17adf40" containerName="pruner" Oct 03 18:16:52 crc kubenswrapper[4835]: I1003 18:16:52.055567 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 18:16:52 crc kubenswrapper[4835]: I1003 18:16:52.059684 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 03 18:16:52 crc kubenswrapper[4835]: I1003 18:16:52.063905 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 03 18:16:52 crc kubenswrapper[4835]: I1003 18:16:52.064590 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 18:16:52 crc kubenswrapper[4835]: I1003 18:16:52.090154 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9880164-05de-4884-81a8-c2607d0865ed-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a9880164-05de-4884-81a8-c2607d0865ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 18:16:52 crc kubenswrapper[4835]: I1003 18:16:52.090240 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9880164-05de-4884-81a8-c2607d0865ed-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a9880164-05de-4884-81a8-c2607d0865ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 18:16:52 crc kubenswrapper[4835]: I1003 18:16:52.191108 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9880164-05de-4884-81a8-c2607d0865ed-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a9880164-05de-4884-81a8-c2607d0865ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 18:16:52 crc kubenswrapper[4835]: I1003 18:16:52.191199 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9880164-05de-4884-81a8-c2607d0865ed-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a9880164-05de-4884-81a8-c2607d0865ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 18:16:52 crc kubenswrapper[4835]: I1003 18:16:52.191259 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9880164-05de-4884-81a8-c2607d0865ed-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a9880164-05de-4884-81a8-c2607d0865ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 18:16:52 crc kubenswrapper[4835]: I1003 18:16:52.215780 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9880164-05de-4884-81a8-c2607d0865ed-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a9880164-05de-4884-81a8-c2607d0865ed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 18:16:52 crc kubenswrapper[4835]: I1003 18:16:52.400554 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 18:16:52 crc kubenswrapper[4835]: I1003 18:16:52.621114 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:52 crc kubenswrapper[4835]: I1003 18:16:52.624575 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-4fnq6" Oct 03 18:16:54 crc kubenswrapper[4835]: I1003 18:16:54.683456 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-26hp9" Oct 03 18:16:59 crc kubenswrapper[4835]: I1003 18:16:59.048805 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:59 crc kubenswrapper[4835]: I1003 18:16:59.053824 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:16:59 crc kubenswrapper[4835]: I1003 18:16:59.076012 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-mcjr4" Oct 03 18:17:01 crc kubenswrapper[4835]: I1003 18:17:01.029951 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs\") pod \"network-metrics-daemon-vlmkl\" (UID: \"e2705556-f411-476d-9d8a-78543bae8dc7\") " pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:17:01 crc kubenswrapper[4835]: I1003 18:17:01.036949 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2705556-f411-476d-9d8a-78543bae8dc7-metrics-certs\") pod \"network-metrics-daemon-vlmkl\" (UID: \"e2705556-f411-476d-9d8a-78543bae8dc7\") " pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:17:01 crc kubenswrapper[4835]: I1003 18:17:01.103552 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vlmkl" Oct 03 18:17:05 crc kubenswrapper[4835]: I1003 18:17:05.358767 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:17:05 crc kubenswrapper[4835]: I1003 18:17:05.359121 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:17:06 crc kubenswrapper[4835]: I1003 18:17:06.497376 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:17:16 crc kubenswrapper[4835]: I1003 18:17:16.107503 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 18:17:16 crc kubenswrapper[4835]: E1003 18:17:16.811955 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 03 18:17:16 crc kubenswrapper[4835]: E1003 18:17:16.812482 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2n7lz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-2dsfn_openshift-marketplace(84574502-e2af-4d0b-83e9-206513b44cbc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 18:17:16 crc kubenswrapper[4835]: E1003 18:17:16.813802 4835 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-2dsfn" podUID="84574502-e2af-4d0b-83e9-206513b44cbc" Oct 03 18:17:17 crc kubenswrapper[4835]: E1003 18:17:17.773002 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-2dsfn" podUID="84574502-e2af-4d0b-83e9-206513b44cbc" Oct 03 18:17:17 crc kubenswrapper[4835]: E1003 18:17:17.856246 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 18:17:17 crc kubenswrapper[4835]: E1003 18:17:17.856381 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-859kj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-frjq8_openshift-marketplace(9fb64305-2843-4f0d-89d9-15c6a2be9353): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 18:17:17 crc kubenswrapper[4835]: E1003 18:17:17.857542 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-frjq8" podUID="9fb64305-2843-4f0d-89d9-15c6a2be9353" Oct 03 18:17:17 crc kubenswrapper[4835]: E1003 18:17:17.861427 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 03 18:17:17 crc kubenswrapper[4835]: E1003 18:17:17.861553 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4s2nx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-ndfwz_openshift-marketplace(fa6d5cf8-dfde-42f7-9507-48f41bf44b50): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 18:17:17 crc kubenswrapper[4835]: E1003 18:17:17.862736 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-ndfwz" podUID="fa6d5cf8-dfde-42f7-9507-48f41bf44b50" Oct 03 18:17:19 crc kubenswrapper[4835]: I1003 18:17:19.575580 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b7btc" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.296286 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-ndfwz" podUID="fa6d5cf8-dfde-42f7-9507-48f41bf44b50" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.296708 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-frjq8" podUID="9fb64305-2843-4f0d-89d9-15c6a2be9353" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.378019 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled 
desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.378209 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4t8sj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xjj46_openshift-marketplace(9b246bf3-c3d8-41d9-9ae1-660fdc057961): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.379369 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-xjj46" podUID="9b246bf3-c3d8-41d9-9ae1-660fdc057961" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.381875 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.381989 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tjc7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-n8brz_openshift-marketplace(801fdb9d-0303-46b9-aa79-8c30cda070eb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.383154 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-n8brz" podUID="801fdb9d-0303-46b9-aa79-8c30cda070eb" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.383745 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.383953 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8g9nw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-tr45z_openshift-marketplace(6fc82477-8141-4654-9153-b2a046309e8b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.385084 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-tr45z" podUID="6fc82477-8141-4654-9153-b2a046309e8b" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.863106 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-tr45z" podUID="6fc82477-8141-4654-9153-b2a046309e8b" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.863164 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xjj46" podUID="9b246bf3-c3d8-41d9-9ae1-660fdc057961" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.863256 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-n8brz" podUID="801fdb9d-0303-46b9-aa79-8c30cda070eb" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.929777 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.930128 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init 
container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8mhtv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-zrf6l_openshift-marketplace(ed6d7146-dd06-4086-8b38-2140c5deeff9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.931327 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-zrf6l" podUID="ed6d7146-dd06-4086-8b38-2140c5deeff9" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.932357 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.932481 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qrnws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-xjmsb_openshift-marketplace(64185dc5-d857-4370-8a3b-f1e5ca460bc0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 18:17:20 crc kubenswrapper[4835]: E1003 18:17:20.936182 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-xjmsb" podUID="64185dc5-d857-4370-8a3b-f1e5ca460bc0" Oct 03 18:17:21 crc kubenswrapper[4835]: I1003 18:17:21.028305 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 18:17:21 crc kubenswrapper[4835]: I1003 18:17:21.064956 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vlmkl"] Oct 03 18:17:21 crc kubenswrapper[4835]: W1003 18:17:21.069532 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2705556_f411_476d_9d8a_78543bae8dc7.slice/crio-34e98bcb7b84318459939c866b5a2ea333657ab3618cd6469ad97618827307e0 WatchSource:0}: Error finding container 34e98bcb7b84318459939c866b5a2ea333657ab3618cd6469ad97618827307e0: Status 404 returned error can't find the container with id 34e98bcb7b84318459939c866b5a2ea333657ab3618cd6469ad97618827307e0 Oct 03 18:17:21 crc kubenswrapper[4835]: I1003 18:17:21.448636 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" event={"ID":"e2705556-f411-476d-9d8a-78543bae8dc7","Type":"ContainerStarted","Data":"1472116c136660b0e2ce9f9ba2f7fc656c04156a0d03987a10dfce3b0986f70e"} Oct 03 18:17:21 crc kubenswrapper[4835]: I1003 18:17:21.448922 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" event={"ID":"e2705556-f411-476d-9d8a-78543bae8dc7","Type":"ContainerStarted","Data":"af142a69e2f0816efd5d7471279c9311c40af1b44a6fef12be20f4f843d57ad0"} Oct 03 18:17:21 crc kubenswrapper[4835]: I1003 
18:17:21.448932 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vlmkl" event={"ID":"e2705556-f411-476d-9d8a-78543bae8dc7","Type":"ContainerStarted","Data":"34e98bcb7b84318459939c866b5a2ea333657ab3618cd6469ad97618827307e0"} Oct 03 18:17:21 crc kubenswrapper[4835]: I1003 18:17:21.452453 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a9880164-05de-4884-81a8-c2607d0865ed","Type":"ContainerStarted","Data":"1e6b6270496b4ad61b4aac115adb0d161a7c1e56aaefc09abe6f05eaacda46ae"} Oct 03 18:17:21 crc kubenswrapper[4835]: I1003 18:17:21.452479 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a9880164-05de-4884-81a8-c2607d0865ed","Type":"ContainerStarted","Data":"837a2e3290a3510f94c54794c04e198f372d204e122a71eb38e2ae2f4b1c6a5c"} Oct 03 18:17:21 crc kubenswrapper[4835]: E1003 18:17:21.453323 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-zrf6l" podUID="ed6d7146-dd06-4086-8b38-2140c5deeff9" Oct 03 18:17:21 crc kubenswrapper[4835]: E1003 18:17:21.454032 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-xjmsb" podUID="64185dc5-d857-4370-8a3b-f1e5ca460bc0" Oct 03 18:17:21 crc kubenswrapper[4835]: I1003 18:17:21.464871 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vlmkl" podStartSLOduration=163.464851643 podStartE2EDuration="2m43.464851643s" podCreationTimestamp="2025-10-03 18:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:17:21.460531839 +0000 UTC m=+183.176472711" watchObservedRunningTime="2025-10-03 18:17:21.464851643 +0000 UTC m=+183.180792515" Oct 03 18:17:21 crc kubenswrapper[4835]: I1003 18:17:21.476263 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=29.476237945 podStartE2EDuration="29.476237945s" podCreationTimestamp="2025-10-03 18:16:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:17:21.473351057 +0000 UTC m=+183.189291929" watchObservedRunningTime="2025-10-03 18:17:21.476237945 +0000 UTC m=+183.192178817" Oct 03 18:17:22 crc kubenswrapper[4835]: I1003 18:17:22.458450 4835 generic.go:334] "Generic (PLEG): container finished" podID="a9880164-05de-4884-81a8-c2607d0865ed" containerID="1e6b6270496b4ad61b4aac115adb0d161a7c1e56aaefc09abe6f05eaacda46ae" exitCode=0 Oct 03 18:17:22 crc kubenswrapper[4835]: I1003 18:17:22.458515 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a9880164-05de-4884-81a8-c2607d0865ed","Type":"ContainerDied","Data":"1e6b6270496b4ad61b4aac115adb0d161a7c1e56aaefc09abe6f05eaacda46ae"} Oct 03 18:17:23 crc kubenswrapper[4835]: I1003 18:17:23.648302 4835 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 18:17:23 crc kubenswrapper[4835]: I1003 18:17:23.794689 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9880164-05de-4884-81a8-c2607d0865ed-kube-api-access\") pod \"a9880164-05de-4884-81a8-c2607d0865ed\" (UID: \"a9880164-05de-4884-81a8-c2607d0865ed\") " Oct 03 18:17:23 crc kubenswrapper[4835]: I1003 18:17:23.794771 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9880164-05de-4884-81a8-c2607d0865ed-kubelet-dir\") pod \"a9880164-05de-4884-81a8-c2607d0865ed\" (UID: \"a9880164-05de-4884-81a8-c2607d0865ed\") " Oct 03 18:17:23 crc kubenswrapper[4835]: I1003 18:17:23.795013 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9880164-05de-4884-81a8-c2607d0865ed-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a9880164-05de-4884-81a8-c2607d0865ed" (UID: "a9880164-05de-4884-81a8-c2607d0865ed"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:17:23 crc kubenswrapper[4835]: I1003 18:17:23.795243 4835 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9880164-05de-4884-81a8-c2607d0865ed-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 03 18:17:23 crc kubenswrapper[4835]: I1003 18:17:23.801028 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9880164-05de-4884-81a8-c2607d0865ed-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a9880164-05de-4884-81a8-c2607d0865ed" (UID: "a9880164-05de-4884-81a8-c2607d0865ed"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:17:23 crc kubenswrapper[4835]: I1003 18:17:23.896824 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9880164-05de-4884-81a8-c2607d0865ed-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 18:17:24 crc kubenswrapper[4835]: I1003 18:17:24.472301 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a9880164-05de-4884-81a8-c2607d0865ed","Type":"ContainerDied","Data":"837a2e3290a3510f94c54794c04e198f372d204e122a71eb38e2ae2f4b1c6a5c"} Oct 03 18:17:24 crc kubenswrapper[4835]: I1003 18:17:24.472589 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="837a2e3290a3510f94c54794c04e198f372d204e122a71eb38e2ae2f4b1c6a5c" Oct 03 18:17:24 crc kubenswrapper[4835]: I1003 18:17:24.472366 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 18:17:35 crc kubenswrapper[4835]: I1003 18:17:35.358947 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:17:35 crc kubenswrapper[4835]: I1003 18:17:35.359607 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:17:35 crc kubenswrapper[4835]: I1003 18:17:35.525612 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dsfn" event={"ID":"84574502-e2af-4d0b-83e9-206513b44cbc","Type":"ContainerStarted","Data":"0bd45f1bdc01b94858de0a3ddfab7e8c1fa7a63f8091e88f3d22c847479928fc"} Oct 03 18:17:36 crc kubenswrapper[4835]: I1003 18:17:36.531615 4835 generic.go:334] "Generic (PLEG): container finished" podID="84574502-e2af-4d0b-83e9-206513b44cbc" containerID="0bd45f1bdc01b94858de0a3ddfab7e8c1fa7a63f8091e88f3d22c847479928fc" exitCode=0 Oct 03 18:17:36 crc kubenswrapper[4835]: I1003 18:17:36.531651 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dsfn" event={"ID":"84574502-e2af-4d0b-83e9-206513b44cbc","Type":"ContainerDied","Data":"0bd45f1bdc01b94858de0a3ddfab7e8c1fa7a63f8091e88f3d22c847479928fc"} Oct 03 18:17:39 crc kubenswrapper[4835]: I1003 18:17:39.564470 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ndfwz" event={"ID":"fa6d5cf8-dfde-42f7-9507-48f41bf44b50","Type":"ContainerStarted","Data":"d6e0faa30ca410cd6997f60a41b76ffbcb3bc8b25eb3aeb4aa5b54212bb62a8e"} Oct 03 18:17:39 crc kubenswrapper[4835]: I1003 18:17:39.569262 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjmsb" event={"ID":"64185dc5-d857-4370-8a3b-f1e5ca460bc0","Type":"ContainerStarted","Data":"da79a2d72a74333dd261b6715974be692e7d0432479a5635588122140131cc32"} Oct 03 18:17:39 crc kubenswrapper[4835]: I1003 18:17:39.575170 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8brz" event={"ID":"801fdb9d-0303-46b9-aa79-8c30cda070eb","Type":"ContainerStarted","Data":"83d461505ea7acd3e1ca60e9b418e3f82905c45bfcc319eb86bc49700330d549"} Oct 03 18:17:39 crc kubenswrapper[4835]: I1003 18:17:39.585058 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frjq8" event={"ID":"9fb64305-2843-4f0d-89d9-15c6a2be9353","Type":"ContainerStarted","Data":"b4b9d9b029dcfb24723f3a5681a20772b694fec3244b54393e559d02fa61667a"} Oct 03 18:17:39 crc kubenswrapper[4835]: I1003 18:17:39.587094 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjj46" event={"ID":"9b246bf3-c3d8-41d9-9ae1-660fdc057961","Type":"ContainerStarted","Data":"84e65d368e36f01bd7ce2a66235690b063a8c5e49324c4f6d01417438728ac64"} Oct 03 18:17:39 crc kubenswrapper[4835]: I1003 18:17:39.588748 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr45z" 
event={"ID":"6fc82477-8141-4654-9153-b2a046309e8b","Type":"ContainerStarted","Data":"0c4f7bdaf5ccec3e9db60f01a7dd1ef3f1e49b205ac3de94f127c7fd91f2ae4d"} Oct 03 18:17:40 crc kubenswrapper[4835]: I1003 18:17:40.597249 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dsfn" event={"ID":"84574502-e2af-4d0b-83e9-206513b44cbc","Type":"ContainerStarted","Data":"bcff6a7c37f82d45529c9156f3ad3dfb60465821e76e5edfe611d2f100bef691"} Oct 03 18:17:40 crc kubenswrapper[4835]: I1003 18:17:40.601134 4835 generic.go:334] "Generic (PLEG): container finished" podID="801fdb9d-0303-46b9-aa79-8c30cda070eb" containerID="83d461505ea7acd3e1ca60e9b418e3f82905c45bfcc319eb86bc49700330d549" exitCode=0 Oct 03 18:17:40 crc kubenswrapper[4835]: I1003 18:17:40.601205 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8brz" event={"ID":"801fdb9d-0303-46b9-aa79-8c30cda070eb","Type":"ContainerDied","Data":"83d461505ea7acd3e1ca60e9b418e3f82905c45bfcc319eb86bc49700330d549"} Oct 03 18:17:40 crc kubenswrapper[4835]: I1003 18:17:40.613352 4835 generic.go:334] "Generic (PLEG): container finished" podID="9fb64305-2843-4f0d-89d9-15c6a2be9353" containerID="b4b9d9b029dcfb24723f3a5681a20772b694fec3244b54393e559d02fa61667a" exitCode=0 Oct 03 18:17:40 crc kubenswrapper[4835]: I1003 18:17:40.613412 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frjq8" event={"ID":"9fb64305-2843-4f0d-89d9-15c6a2be9353","Type":"ContainerDied","Data":"b4b9d9b029dcfb24723f3a5681a20772b694fec3244b54393e559d02fa61667a"} Oct 03 18:17:40 crc kubenswrapper[4835]: I1003 18:17:40.620497 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2dsfn" podStartSLOduration=3.9656332020000002 podStartE2EDuration="56.620476866s" podCreationTimestamp="2025-10-03 18:16:44 +0000 UTC" firstStartedPulling="2025-10-03 18:16:47.098474846 +0000 UTC m=+148.814415718" lastFinishedPulling="2025-10-03 18:17:39.75331851 +0000 UTC m=+201.469259382" observedRunningTime="2025-10-03 18:17:40.617902057 +0000 UTC m=+202.333842929" watchObservedRunningTime="2025-10-03 18:17:40.620476866 +0000 UTC m=+202.336417738" Oct 03 18:17:40 crc kubenswrapper[4835]: I1003 18:17:40.621560 4835 generic.go:334] "Generic (PLEG): container finished" podID="6fc82477-8141-4654-9153-b2a046309e8b" containerID="0c4f7bdaf5ccec3e9db60f01a7dd1ef3f1e49b205ac3de94f127c7fd91f2ae4d" exitCode=0 Oct 03 18:17:40 crc kubenswrapper[4835]: I1003 18:17:40.621637 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr45z" event={"ID":"6fc82477-8141-4654-9153-b2a046309e8b","Type":"ContainerDied","Data":"0c4f7bdaf5ccec3e9db60f01a7dd1ef3f1e49b205ac3de94f127c7fd91f2ae4d"} Oct 03 18:17:40 crc kubenswrapper[4835]: I1003 18:17:40.624504 4835 generic.go:334] "Generic (PLEG): container finished" podID="9b246bf3-c3d8-41d9-9ae1-660fdc057961" containerID="84e65d368e36f01bd7ce2a66235690b063a8c5e49324c4f6d01417438728ac64" exitCode=0 Oct 03 18:17:40 crc kubenswrapper[4835]: I1003 18:17:40.624571 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjj46" event={"ID":"9b246bf3-c3d8-41d9-9ae1-660fdc057961","Type":"ContainerDied","Data":"84e65d368e36f01bd7ce2a66235690b063a8c5e49324c4f6d01417438728ac64"} Oct 03 18:17:40 crc kubenswrapper[4835]: I1003 18:17:40.628916 4835 generic.go:334] "Generic (PLEG): container finished" 
podID="fa6d5cf8-dfde-42f7-9507-48f41bf44b50" containerID="d6e0faa30ca410cd6997f60a41b76ffbcb3bc8b25eb3aeb4aa5b54212bb62a8e" exitCode=0 Oct 03 18:17:40 crc kubenswrapper[4835]: I1003 18:17:40.628981 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ndfwz" event={"ID":"fa6d5cf8-dfde-42f7-9507-48f41bf44b50","Type":"ContainerDied","Data":"d6e0faa30ca410cd6997f60a41b76ffbcb3bc8b25eb3aeb4aa5b54212bb62a8e"} Oct 03 18:17:40 crc kubenswrapper[4835]: I1003 18:17:40.637840 4835 generic.go:334] "Generic (PLEG): container finished" podID="ed6d7146-dd06-4086-8b38-2140c5deeff9" containerID="83a08a436c027e500ff2159dcd4f23b67ebdd325eca5c9e7ecf14e3fa77a2344" exitCode=0 Oct 03 18:17:40 crc kubenswrapper[4835]: I1003 18:17:40.637932 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrf6l" event={"ID":"ed6d7146-dd06-4086-8b38-2140c5deeff9","Type":"ContainerDied","Data":"83a08a436c027e500ff2159dcd4f23b67ebdd325eca5c9e7ecf14e3fa77a2344"} Oct 03 18:17:40 crc kubenswrapper[4835]: I1003 18:17:40.640367 4835 generic.go:334] "Generic (PLEG): container finished" podID="64185dc5-d857-4370-8a3b-f1e5ca460bc0" containerID="da79a2d72a74333dd261b6715974be692e7d0432479a5635588122140131cc32" exitCode=0 Oct 03 18:17:40 crc kubenswrapper[4835]: I1003 18:17:40.640406 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjmsb" event={"ID":"64185dc5-d857-4370-8a3b-f1e5ca460bc0","Type":"ContainerDied","Data":"da79a2d72a74333dd261b6715974be692e7d0432479a5635588122140131cc32"} Oct 03 18:17:41 crc kubenswrapper[4835]: I1003 18:17:41.647981 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrf6l" event={"ID":"ed6d7146-dd06-4086-8b38-2140c5deeff9","Type":"ContainerStarted","Data":"70ff4885bc0cd2cf0d5f3caf9c0544c4117cf5d70a7ac20854343bbb48a18986"} Oct 03 18:17:41 crc kubenswrapper[4835]: I1003 18:17:41.649938 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8brz" event={"ID":"801fdb9d-0303-46b9-aa79-8c30cda070eb","Type":"ContainerStarted","Data":"5c78f2004076a8578f224598072f4df4f279aff9d3185832bafcb9466ef75302"} Oct 03 18:17:41 crc kubenswrapper[4835]: I1003 18:17:41.653108 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frjq8" event={"ID":"9fb64305-2843-4f0d-89d9-15c6a2be9353","Type":"ContainerStarted","Data":"8256d5dcfe5d00bc9dd8dd9c9fc015b6b2acf2347eba9cabfe02d675fd2f0014"} Oct 03 18:17:41 crc kubenswrapper[4835]: I1003 18:17:41.655176 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjj46" event={"ID":"9b246bf3-c3d8-41d9-9ae1-660fdc057961","Type":"ContainerStarted","Data":"66c205247c1015e912787eaacf2b023ee5cbc2544074cf330e681b53cd0d4c4f"} Oct 03 18:17:41 crc kubenswrapper[4835]: I1003 18:17:41.657115 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr45z" event={"ID":"6fc82477-8141-4654-9153-b2a046309e8b","Type":"ContainerStarted","Data":"1a21044041fd8a83ac4a2b8d08cf845af1adc75efa16269436b2cbe4f421b5bc"} Oct 03 18:17:41 crc kubenswrapper[4835]: I1003 18:17:41.659816 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ndfwz" event={"ID":"fa6d5cf8-dfde-42f7-9507-48f41bf44b50","Type":"ContainerStarted","Data":"ee10c525603d688141ed43bd1e74e4bcff8f2e3e00b7b7669475a92eab4881b6"} Oct 03 
18:17:41 crc kubenswrapper[4835]: I1003 18:17:41.671118 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zrf6l" podStartSLOduration=2.504826741 podStartE2EDuration="55.671102647s" podCreationTimestamp="2025-10-03 18:16:46 +0000 UTC" firstStartedPulling="2025-10-03 18:16:48.213317065 +0000 UTC m=+149.929257937" lastFinishedPulling="2025-10-03 18:17:41.379592971 +0000 UTC m=+203.095533843" observedRunningTime="2025-10-03 18:17:41.667380359 +0000 UTC m=+203.383321231" watchObservedRunningTime="2025-10-03 18:17:41.671102647 +0000 UTC m=+203.387043519" Oct 03 18:17:41 crc kubenswrapper[4835]: I1003 18:17:41.689604 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-frjq8" podStartSLOduration=3.624267222 podStartE2EDuration="57.689589155s" podCreationTimestamp="2025-10-03 18:16:44 +0000 UTC" firstStartedPulling="2025-10-03 18:16:47.131358045 +0000 UTC m=+148.847298907" lastFinishedPulling="2025-10-03 18:17:41.196679978 +0000 UTC m=+202.912620840" observedRunningTime="2025-10-03 18:17:41.688416983 +0000 UTC m=+203.404357855" watchObservedRunningTime="2025-10-03 18:17:41.689589155 +0000 UTC m=+203.405530027" Oct 03 18:17:41 crc kubenswrapper[4835]: I1003 18:17:41.708748 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tr45z" podStartSLOduration=2.807907401 podStartE2EDuration="54.70873239s" podCreationTimestamp="2025-10-03 18:16:47 +0000 UTC" firstStartedPulling="2025-10-03 18:16:49.270426779 +0000 UTC m=+150.986367651" lastFinishedPulling="2025-10-03 18:17:41.171251768 +0000 UTC m=+202.887192640" observedRunningTime="2025-10-03 18:17:41.705062592 +0000 UTC m=+203.421003464" watchObservedRunningTime="2025-10-03 18:17:41.70873239 +0000 UTC m=+203.424673262" Oct 03 18:17:41 crc kubenswrapper[4835]: I1003 18:17:41.721137 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ndfwz" podStartSLOduration=2.293072768 podStartE2EDuration="57.721121366s" podCreationTimestamp="2025-10-03 18:16:44 +0000 UTC" firstStartedPulling="2025-10-03 18:16:46.028009299 +0000 UTC m=+147.743950171" lastFinishedPulling="2025-10-03 18:17:41.456057897 +0000 UTC m=+203.171998769" observedRunningTime="2025-10-03 18:17:41.717521091 +0000 UTC m=+203.433461963" watchObservedRunningTime="2025-10-03 18:17:41.721121366 +0000 UTC m=+203.437062238" Oct 03 18:17:41 crc kubenswrapper[4835]: I1003 18:17:41.749307 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xjj46" podStartSLOduration=3.6268059409999998 podStartE2EDuration="57.749290959s" podCreationTimestamp="2025-10-03 18:16:44 +0000 UTC" firstStartedPulling="2025-10-03 18:16:47.168745173 +0000 UTC m=+148.884686045" lastFinishedPulling="2025-10-03 18:17:41.291230191 +0000 UTC m=+203.007171063" observedRunningTime="2025-10-03 18:17:41.748741494 +0000 UTC m=+203.464682356" watchObservedRunningTime="2025-10-03 18:17:41.749290959 +0000 UTC m=+203.465231841" Oct 03 18:17:41 crc kubenswrapper[4835]: I1003 18:17:41.768123 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n8brz" podStartSLOduration=2.611651655 podStartE2EDuration="54.768103815s" podCreationTimestamp="2025-10-03 18:16:47 +0000 UTC" firstStartedPulling="2025-10-03 18:16:49.243344783 +0000 UTC m=+150.959285655" 
lastFinishedPulling="2025-10-03 18:17:41.399796943 +0000 UTC m=+203.115737815" observedRunningTime="2025-10-03 18:17:41.765436014 +0000 UTC m=+203.481376886" watchObservedRunningTime="2025-10-03 18:17:41.768103815 +0000 UTC m=+203.484044687" Oct 03 18:17:42 crc kubenswrapper[4835]: I1003 18:17:42.667784 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjmsb" event={"ID":"64185dc5-d857-4370-8a3b-f1e5ca460bc0","Type":"ContainerStarted","Data":"d2790e8442b4e454c81b4210af35519cf7f9b566fcb9673803236dfab260ec01"} Oct 03 18:17:42 crc kubenswrapper[4835]: I1003 18:17:42.687636 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xjmsb" podStartSLOduration=3.35355026 podStartE2EDuration="56.68761855s" podCreationTimestamp="2025-10-03 18:16:46 +0000 UTC" firstStartedPulling="2025-10-03 18:16:48.213336636 +0000 UTC m=+149.929277508" lastFinishedPulling="2025-10-03 18:17:41.547404926 +0000 UTC m=+203.263345798" observedRunningTime="2025-10-03 18:17:42.685603427 +0000 UTC m=+204.401544299" watchObservedRunningTime="2025-10-03 18:17:42.68761855 +0000 UTC m=+204.403559422" Oct 03 18:17:44 crc kubenswrapper[4835]: I1003 18:17:44.561204 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:17:44 crc kubenswrapper[4835]: I1003 18:17:44.561755 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:17:44 crc kubenswrapper[4835]: I1003 18:17:44.726552 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:17:44 crc kubenswrapper[4835]: I1003 18:17:44.726605 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:17:44 crc kubenswrapper[4835]: I1003 18:17:44.838866 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:17:44 crc kubenswrapper[4835]: I1003 18:17:44.839305 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:17:45 crc kubenswrapper[4835]: I1003 18:17:45.066129 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:17:45 crc kubenswrapper[4835]: I1003 18:17:45.066473 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:17:45 crc kubenswrapper[4835]: I1003 18:17:45.107439 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:17:45 crc kubenswrapper[4835]: I1003 18:17:45.135228 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:17:45 crc kubenswrapper[4835]: I1003 18:17:45.135297 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:17:45 crc kubenswrapper[4835]: I1003 18:17:45.173920 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:17:45 crc kubenswrapper[4835]: I1003 18:17:45.724536 4835 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:17:46 crc kubenswrapper[4835]: I1003 18:17:46.720664 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:17:46 crc kubenswrapper[4835]: I1003 18:17:46.720706 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:17:46 crc kubenswrapper[4835]: I1003 18:17:46.756399 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:17:47 crc kubenswrapper[4835]: I1003 18:17:47.144363 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:17:47 crc kubenswrapper[4835]: I1003 18:17:47.144637 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:17:47 crc kubenswrapper[4835]: I1003 18:17:47.177510 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:17:47 crc kubenswrapper[4835]: I1003 18:17:47.704284 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:17:47 crc kubenswrapper[4835]: I1003 18:17:47.704491 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:17:47 crc kubenswrapper[4835]: I1003 18:17:47.730881 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:17:47 crc kubenswrapper[4835]: I1003 18:17:47.737723 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:17:47 crc kubenswrapper[4835]: I1003 18:17:47.740733 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:17:48 crc kubenswrapper[4835]: I1003 18:17:48.123486 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:17:48 crc kubenswrapper[4835]: I1003 18:17:48.123536 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:17:48 crc kubenswrapper[4835]: I1003 18:17:48.169247 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:17:48 crc kubenswrapper[4835]: I1003 18:17:48.512762 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2dsfn"] Oct 03 18:17:48 crc kubenswrapper[4835]: I1003 18:17:48.698941 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2dsfn" podUID="84574502-e2af-4d0b-83e9-206513b44cbc" containerName="registry-server" containerID="cri-o://bcff6a7c37f82d45529c9156f3ad3dfb60465821e76e5edfe611d2f100bef691" gracePeriod=2 Oct 03 18:17:48 crc kubenswrapper[4835]: I1003 18:17:48.735912 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:17:48 crc kubenswrapper[4835]: I1003 18:17:48.748304 
4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.024901 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.128370 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n7lz\" (UniqueName: \"kubernetes.io/projected/84574502-e2af-4d0b-83e9-206513b44cbc-kube-api-access-2n7lz\") pod \"84574502-e2af-4d0b-83e9-206513b44cbc\" (UID: \"84574502-e2af-4d0b-83e9-206513b44cbc\") " Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.128458 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84574502-e2af-4d0b-83e9-206513b44cbc-utilities\") pod \"84574502-e2af-4d0b-83e9-206513b44cbc\" (UID: \"84574502-e2af-4d0b-83e9-206513b44cbc\") " Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.128520 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84574502-e2af-4d0b-83e9-206513b44cbc-catalog-content\") pod \"84574502-e2af-4d0b-83e9-206513b44cbc\" (UID: \"84574502-e2af-4d0b-83e9-206513b44cbc\") " Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.130233 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84574502-e2af-4d0b-83e9-206513b44cbc-utilities" (OuterVolumeSpecName: "utilities") pod "84574502-e2af-4d0b-83e9-206513b44cbc" (UID: "84574502-e2af-4d0b-83e9-206513b44cbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.134543 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84574502-e2af-4d0b-83e9-206513b44cbc-kube-api-access-2n7lz" (OuterVolumeSpecName: "kube-api-access-2n7lz") pod "84574502-e2af-4d0b-83e9-206513b44cbc" (UID: "84574502-e2af-4d0b-83e9-206513b44cbc"). InnerVolumeSpecName "kube-api-access-2n7lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.179303 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84574502-e2af-4d0b-83e9-206513b44cbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84574502-e2af-4d0b-83e9-206513b44cbc" (UID: "84574502-e2af-4d0b-83e9-206513b44cbc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.229689 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n7lz\" (UniqueName: \"kubernetes.io/projected/84574502-e2af-4d0b-83e9-206513b44cbc-kube-api-access-2n7lz\") on node \"crc\" DevicePath \"\"" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.229722 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84574502-e2af-4d0b-83e9-206513b44cbc-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.229732 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84574502-e2af-4d0b-83e9-206513b44cbc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.707435 4835 generic.go:334] "Generic (PLEG): container finished" podID="84574502-e2af-4d0b-83e9-206513b44cbc" containerID="bcff6a7c37f82d45529c9156f3ad3dfb60465821e76e5edfe611d2f100bef691" exitCode=0 Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.707510 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2dsfn" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.707564 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dsfn" event={"ID":"84574502-e2af-4d0b-83e9-206513b44cbc","Type":"ContainerDied","Data":"bcff6a7c37f82d45529c9156f3ad3dfb60465821e76e5edfe611d2f100bef691"} Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.707595 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2dsfn" event={"ID":"84574502-e2af-4d0b-83e9-206513b44cbc","Type":"ContainerDied","Data":"ba4a3876aa2746b79f30fd4b4a95404ae47827f9a378f431fa6e66d3e945817f"} Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.707612 4835 scope.go:117] "RemoveContainer" containerID="bcff6a7c37f82d45529c9156f3ad3dfb60465821e76e5edfe611d2f100bef691" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.722093 4835 scope.go:117] "RemoveContainer" containerID="0bd45f1bdc01b94858de0a3ddfab7e8c1fa7a63f8091e88f3d22c847479928fc" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.737861 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2dsfn"] Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.739708 4835 scope.go:117] "RemoveContainer" containerID="d99274bb05f2541fdd40265c715397c0f020a5e7101e72bfbfdb058862383c23" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.740736 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2dsfn"] Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.756860 4835 scope.go:117] "RemoveContainer" containerID="bcff6a7c37f82d45529c9156f3ad3dfb60465821e76e5edfe611d2f100bef691" Oct 03 18:17:49 crc kubenswrapper[4835]: E1003 18:17:49.757212 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcff6a7c37f82d45529c9156f3ad3dfb60465821e76e5edfe611d2f100bef691\": container with ID starting with bcff6a7c37f82d45529c9156f3ad3dfb60465821e76e5edfe611d2f100bef691 not found: ID does not exist" containerID="bcff6a7c37f82d45529c9156f3ad3dfb60465821e76e5edfe611d2f100bef691" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.757265 
4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcff6a7c37f82d45529c9156f3ad3dfb60465821e76e5edfe611d2f100bef691"} err="failed to get container status \"bcff6a7c37f82d45529c9156f3ad3dfb60465821e76e5edfe611d2f100bef691\": rpc error: code = NotFound desc = could not find container \"bcff6a7c37f82d45529c9156f3ad3dfb60465821e76e5edfe611d2f100bef691\": container with ID starting with bcff6a7c37f82d45529c9156f3ad3dfb60465821e76e5edfe611d2f100bef691 not found: ID does not exist" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.757306 4835 scope.go:117] "RemoveContainer" containerID="0bd45f1bdc01b94858de0a3ddfab7e8c1fa7a63f8091e88f3d22c847479928fc" Oct 03 18:17:49 crc kubenswrapper[4835]: E1003 18:17:49.757745 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bd45f1bdc01b94858de0a3ddfab7e8c1fa7a63f8091e88f3d22c847479928fc\": container with ID starting with 0bd45f1bdc01b94858de0a3ddfab7e8c1fa7a63f8091e88f3d22c847479928fc not found: ID does not exist" containerID="0bd45f1bdc01b94858de0a3ddfab7e8c1fa7a63f8091e88f3d22c847479928fc" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.757781 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bd45f1bdc01b94858de0a3ddfab7e8c1fa7a63f8091e88f3d22c847479928fc"} err="failed to get container status \"0bd45f1bdc01b94858de0a3ddfab7e8c1fa7a63f8091e88f3d22c847479928fc\": rpc error: code = NotFound desc = could not find container \"0bd45f1bdc01b94858de0a3ddfab7e8c1fa7a63f8091e88f3d22c847479928fc\": container with ID starting with 0bd45f1bdc01b94858de0a3ddfab7e8c1fa7a63f8091e88f3d22c847479928fc not found: ID does not exist" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.757803 4835 scope.go:117] "RemoveContainer" containerID="d99274bb05f2541fdd40265c715397c0f020a5e7101e72bfbfdb058862383c23" Oct 03 18:17:49 crc kubenswrapper[4835]: E1003 18:17:49.758118 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d99274bb05f2541fdd40265c715397c0f020a5e7101e72bfbfdb058862383c23\": container with ID starting with d99274bb05f2541fdd40265c715397c0f020a5e7101e72bfbfdb058862383c23 not found: ID does not exist" containerID="d99274bb05f2541fdd40265c715397c0f020a5e7101e72bfbfdb058862383c23" Oct 03 18:17:49 crc kubenswrapper[4835]: I1003 18:17:49.758158 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d99274bb05f2541fdd40265c715397c0f020a5e7101e72bfbfdb058862383c23"} err="failed to get container status \"d99274bb05f2541fdd40265c715397c0f020a5e7101e72bfbfdb058862383c23\": rpc error: code = NotFound desc = could not find container \"d99274bb05f2541fdd40265c715397c0f020a5e7101e72bfbfdb058862383c23\": container with ID starting with d99274bb05f2541fdd40265c715397c0f020a5e7101e72bfbfdb058862383c23 not found: ID does not exist" Oct 03 18:17:50 crc kubenswrapper[4835]: I1003 18:17:50.883487 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84574502-e2af-4d0b-83e9-206513b44cbc" path="/var/lib/kubelet/pods/84574502-e2af-4d0b-83e9-206513b44cbc/volumes" Oct 03 18:17:50 crc kubenswrapper[4835]: I1003 18:17:50.907634 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjmsb"] Oct 03 18:17:50 crc kubenswrapper[4835]: I1003 18:17:50.907847 4835 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-marketplace-xjmsb" podUID="64185dc5-d857-4370-8a3b-f1e5ca460bc0" containerName="registry-server" containerID="cri-o://d2790e8442b4e454c81b4210af35519cf7f9b566fcb9673803236dfab260ec01" gracePeriod=2 Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.235604 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.357507 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64185dc5-d857-4370-8a3b-f1e5ca460bc0-catalog-content\") pod \"64185dc5-d857-4370-8a3b-f1e5ca460bc0\" (UID: \"64185dc5-d857-4370-8a3b-f1e5ca460bc0\") " Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.357563 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrnws\" (UniqueName: \"kubernetes.io/projected/64185dc5-d857-4370-8a3b-f1e5ca460bc0-kube-api-access-qrnws\") pod \"64185dc5-d857-4370-8a3b-f1e5ca460bc0\" (UID: \"64185dc5-d857-4370-8a3b-f1e5ca460bc0\") " Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.357700 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64185dc5-d857-4370-8a3b-f1e5ca460bc0-utilities\") pod \"64185dc5-d857-4370-8a3b-f1e5ca460bc0\" (UID: \"64185dc5-d857-4370-8a3b-f1e5ca460bc0\") " Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.358591 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64185dc5-d857-4370-8a3b-f1e5ca460bc0-utilities" (OuterVolumeSpecName: "utilities") pod "64185dc5-d857-4370-8a3b-f1e5ca460bc0" (UID: "64185dc5-d857-4370-8a3b-f1e5ca460bc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.363129 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64185dc5-d857-4370-8a3b-f1e5ca460bc0-kube-api-access-qrnws" (OuterVolumeSpecName: "kube-api-access-qrnws") pod "64185dc5-d857-4370-8a3b-f1e5ca460bc0" (UID: "64185dc5-d857-4370-8a3b-f1e5ca460bc0"). InnerVolumeSpecName "kube-api-access-qrnws". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.370897 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64185dc5-d857-4370-8a3b-f1e5ca460bc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64185dc5-d857-4370-8a3b-f1e5ca460bc0" (UID: "64185dc5-d857-4370-8a3b-f1e5ca460bc0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.459368 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64185dc5-d857-4370-8a3b-f1e5ca460bc0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.459439 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrnws\" (UniqueName: \"kubernetes.io/projected/64185dc5-d857-4370-8a3b-f1e5ca460bc0-kube-api-access-qrnws\") on node \"crc\" DevicePath \"\"" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.459453 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64185dc5-d857-4370-8a3b-f1e5ca460bc0-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.720539 4835 generic.go:334] "Generic (PLEG): container finished" podID="64185dc5-d857-4370-8a3b-f1e5ca460bc0" containerID="d2790e8442b4e454c81b4210af35519cf7f9b566fcb9673803236dfab260ec01" exitCode=0 Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.720580 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjmsb" event={"ID":"64185dc5-d857-4370-8a3b-f1e5ca460bc0","Type":"ContainerDied","Data":"d2790e8442b4e454c81b4210af35519cf7f9b566fcb9673803236dfab260ec01"} Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.720605 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjmsb" event={"ID":"64185dc5-d857-4370-8a3b-f1e5ca460bc0","Type":"ContainerDied","Data":"d0031570191d37210a919a84ba898b8197c3acfc0b1c873d0787eda805c12b21"} Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.720603 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjmsb" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.720621 4835 scope.go:117] "RemoveContainer" containerID="d2790e8442b4e454c81b4210af35519cf7f9b566fcb9673803236dfab260ec01" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.741561 4835 scope.go:117] "RemoveContainer" containerID="da79a2d72a74333dd261b6715974be692e7d0432479a5635588122140131cc32" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.760967 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjmsb"] Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.767617 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjmsb"] Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.770818 4835 scope.go:117] "RemoveContainer" containerID="15533908bc132e910a993ef83532e71105b08548c9cfd28c4fa5559b359e819c" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.790108 4835 scope.go:117] "RemoveContainer" containerID="d2790e8442b4e454c81b4210af35519cf7f9b566fcb9673803236dfab260ec01" Oct 03 18:17:51 crc kubenswrapper[4835]: E1003 18:17:51.792498 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2790e8442b4e454c81b4210af35519cf7f9b566fcb9673803236dfab260ec01\": container with ID starting with d2790e8442b4e454c81b4210af35519cf7f9b566fcb9673803236dfab260ec01 not found: ID does not exist" containerID="d2790e8442b4e454c81b4210af35519cf7f9b566fcb9673803236dfab260ec01" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.792553 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2790e8442b4e454c81b4210af35519cf7f9b566fcb9673803236dfab260ec01"} err="failed to get container status \"d2790e8442b4e454c81b4210af35519cf7f9b566fcb9673803236dfab260ec01\": rpc error: code = NotFound desc = could not find container \"d2790e8442b4e454c81b4210af35519cf7f9b566fcb9673803236dfab260ec01\": container with ID starting with d2790e8442b4e454c81b4210af35519cf7f9b566fcb9673803236dfab260ec01 not found: ID does not exist" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.792605 4835 scope.go:117] "RemoveContainer" containerID="da79a2d72a74333dd261b6715974be692e7d0432479a5635588122140131cc32" Oct 03 18:17:51 crc kubenswrapper[4835]: E1003 18:17:51.792959 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da79a2d72a74333dd261b6715974be692e7d0432479a5635588122140131cc32\": container with ID starting with da79a2d72a74333dd261b6715974be692e7d0432479a5635588122140131cc32 not found: ID does not exist" containerID="da79a2d72a74333dd261b6715974be692e7d0432479a5635588122140131cc32" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.792998 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da79a2d72a74333dd261b6715974be692e7d0432479a5635588122140131cc32"} err="failed to get container status \"da79a2d72a74333dd261b6715974be692e7d0432479a5635588122140131cc32\": rpc error: code = NotFound desc = could not find container \"da79a2d72a74333dd261b6715974be692e7d0432479a5635588122140131cc32\": container with ID starting with da79a2d72a74333dd261b6715974be692e7d0432479a5635588122140131cc32 not found: ID does not exist" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.793027 4835 scope.go:117] "RemoveContainer" 
containerID="15533908bc132e910a993ef83532e71105b08548c9cfd28c4fa5559b359e819c" Oct 03 18:17:51 crc kubenswrapper[4835]: E1003 18:17:51.793333 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15533908bc132e910a993ef83532e71105b08548c9cfd28c4fa5559b359e819c\": container with ID starting with 15533908bc132e910a993ef83532e71105b08548c9cfd28c4fa5559b359e819c not found: ID does not exist" containerID="15533908bc132e910a993ef83532e71105b08548c9cfd28c4fa5559b359e819c" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.793362 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15533908bc132e910a993ef83532e71105b08548c9cfd28c4fa5559b359e819c"} err="failed to get container status \"15533908bc132e910a993ef83532e71105b08548c9cfd28c4fa5559b359e819c\": rpc error: code = NotFound desc = could not find container \"15533908bc132e910a993ef83532e71105b08548c9cfd28c4fa5559b359e819c\": container with ID starting with 15533908bc132e910a993ef83532e71105b08548c9cfd28c4fa5559b359e819c not found: ID does not exist" Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.914928 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8brz"] Oct 03 18:17:51 crc kubenswrapper[4835]: I1003 18:17:51.916203 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n8brz" podUID="801fdb9d-0303-46b9-aa79-8c30cda070eb" containerName="registry-server" containerID="cri-o://5c78f2004076a8578f224598072f4df4f279aff9d3185832bafcb9466ef75302" gracePeriod=2 Oct 03 18:17:52 crc kubenswrapper[4835]: I1003 18:17:52.729408 4835 generic.go:334] "Generic (PLEG): container finished" podID="801fdb9d-0303-46b9-aa79-8c30cda070eb" containerID="5c78f2004076a8578f224598072f4df4f279aff9d3185832bafcb9466ef75302" exitCode=0 Oct 03 18:17:52 crc kubenswrapper[4835]: I1003 18:17:52.729478 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8brz" event={"ID":"801fdb9d-0303-46b9-aa79-8c30cda070eb","Type":"ContainerDied","Data":"5c78f2004076a8578f224598072f4df4f279aff9d3185832bafcb9466ef75302"} Oct 03 18:17:52 crc kubenswrapper[4835]: I1003 18:17:52.729801 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8brz" event={"ID":"801fdb9d-0303-46b9-aa79-8c30cda070eb","Type":"ContainerDied","Data":"76b5bc562aac1b0caf783a70b9a3a091e993072ec0c4a29f20ad30eeab912a26"} Oct 03 18:17:52 crc kubenswrapper[4835]: I1003 18:17:52.729814 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76b5bc562aac1b0caf783a70b9a3a091e993072ec0c4a29f20ad30eeab912a26" Oct 03 18:17:52 crc kubenswrapper[4835]: I1003 18:17:52.737688 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:17:52 crc kubenswrapper[4835]: I1003 18:17:52.875501 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/801fdb9d-0303-46b9-aa79-8c30cda070eb-catalog-content\") pod \"801fdb9d-0303-46b9-aa79-8c30cda070eb\" (UID: \"801fdb9d-0303-46b9-aa79-8c30cda070eb\") " Oct 03 18:17:52 crc kubenswrapper[4835]: I1003 18:17:52.875804 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjc7g\" (UniqueName: \"kubernetes.io/projected/801fdb9d-0303-46b9-aa79-8c30cda070eb-kube-api-access-tjc7g\") pod \"801fdb9d-0303-46b9-aa79-8c30cda070eb\" (UID: \"801fdb9d-0303-46b9-aa79-8c30cda070eb\") " Oct 03 18:17:52 crc kubenswrapper[4835]: I1003 18:17:52.875988 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/801fdb9d-0303-46b9-aa79-8c30cda070eb-utilities\") pod \"801fdb9d-0303-46b9-aa79-8c30cda070eb\" (UID: \"801fdb9d-0303-46b9-aa79-8c30cda070eb\") " Oct 03 18:17:52 crc kubenswrapper[4835]: I1003 18:17:52.877279 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/801fdb9d-0303-46b9-aa79-8c30cda070eb-utilities" (OuterVolumeSpecName: "utilities") pod "801fdb9d-0303-46b9-aa79-8c30cda070eb" (UID: "801fdb9d-0303-46b9-aa79-8c30cda070eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:17:52 crc kubenswrapper[4835]: I1003 18:17:52.883220 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801fdb9d-0303-46b9-aa79-8c30cda070eb-kube-api-access-tjc7g" (OuterVolumeSpecName: "kube-api-access-tjc7g") pod "801fdb9d-0303-46b9-aa79-8c30cda070eb" (UID: "801fdb9d-0303-46b9-aa79-8c30cda070eb"). InnerVolumeSpecName "kube-api-access-tjc7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:17:52 crc kubenswrapper[4835]: I1003 18:17:52.885438 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64185dc5-d857-4370-8a3b-f1e5ca460bc0" path="/var/lib/kubelet/pods/64185dc5-d857-4370-8a3b-f1e5ca460bc0/volumes" Oct 03 18:17:52 crc kubenswrapper[4835]: I1003 18:17:52.972799 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/801fdb9d-0303-46b9-aa79-8c30cda070eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "801fdb9d-0303-46b9-aa79-8c30cda070eb" (UID: "801fdb9d-0303-46b9-aa79-8c30cda070eb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:17:52 crc kubenswrapper[4835]: I1003 18:17:52.977406 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/801fdb9d-0303-46b9-aa79-8c30cda070eb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:17:52 crc kubenswrapper[4835]: I1003 18:17:52.977446 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjc7g\" (UniqueName: \"kubernetes.io/projected/801fdb9d-0303-46b9-aa79-8c30cda070eb-kube-api-access-tjc7g\") on node \"crc\" DevicePath \"\"" Oct 03 18:17:52 crc kubenswrapper[4835]: I1003 18:17:52.977463 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/801fdb9d-0303-46b9-aa79-8c30cda070eb-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:17:53 crc kubenswrapper[4835]: I1003 18:17:53.734419 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8brz" Oct 03 18:17:53 crc kubenswrapper[4835]: I1003 18:17:53.780835 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8brz"] Oct 03 18:17:53 crc kubenswrapper[4835]: I1003 18:17:53.784372 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n8brz"] Oct 03 18:17:54 crc kubenswrapper[4835]: I1003 18:17:54.602428 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:17:54 crc kubenswrapper[4835]: I1003 18:17:54.765995 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:17:54 crc kubenswrapper[4835]: I1003 18:17:54.883321 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="801fdb9d-0303-46b9-aa79-8c30cda070eb" path="/var/lib/kubelet/pods/801fdb9d-0303-46b9-aa79-8c30cda070eb/volumes" Oct 03 18:17:55 crc kubenswrapper[4835]: I1003 18:17:55.170404 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:17:57 crc kubenswrapper[4835]: I1003 18:17:57.969004 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p7wfs"] Oct 03 18:17:58 crc kubenswrapper[4835]: I1003 18:17:58.709986 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-frjq8"] Oct 03 18:17:58 crc kubenswrapper[4835]: I1003 18:17:58.710223 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-frjq8" podUID="9fb64305-2843-4f0d-89d9-15c6a2be9353" containerName="registry-server" containerID="cri-o://8256d5dcfe5d00bc9dd8dd9c9fc015b6b2acf2347eba9cabfe02d675fd2f0014" gracePeriod=2 Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.077205 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.152734 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb64305-2843-4f0d-89d9-15c6a2be9353-catalog-content\") pod \"9fb64305-2843-4f0d-89d9-15c6a2be9353\" (UID: \"9fb64305-2843-4f0d-89d9-15c6a2be9353\") " Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.152771 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb64305-2843-4f0d-89d9-15c6a2be9353-utilities\") pod \"9fb64305-2843-4f0d-89d9-15c6a2be9353\" (UID: \"9fb64305-2843-4f0d-89d9-15c6a2be9353\") " Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.152833 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-859kj\" (UniqueName: \"kubernetes.io/projected/9fb64305-2843-4f0d-89d9-15c6a2be9353-kube-api-access-859kj\") pod \"9fb64305-2843-4f0d-89d9-15c6a2be9353\" (UID: \"9fb64305-2843-4f0d-89d9-15c6a2be9353\") " Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.153581 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fb64305-2843-4f0d-89d9-15c6a2be9353-utilities" (OuterVolumeSpecName: "utilities") pod "9fb64305-2843-4f0d-89d9-15c6a2be9353" (UID: "9fb64305-2843-4f0d-89d9-15c6a2be9353"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.157845 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fb64305-2843-4f0d-89d9-15c6a2be9353-kube-api-access-859kj" (OuterVolumeSpecName: "kube-api-access-859kj") pod "9fb64305-2843-4f0d-89d9-15c6a2be9353" (UID: "9fb64305-2843-4f0d-89d9-15c6a2be9353"). InnerVolumeSpecName "kube-api-access-859kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.201518 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fb64305-2843-4f0d-89d9-15c6a2be9353-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fb64305-2843-4f0d-89d9-15c6a2be9353" (UID: "9fb64305-2843-4f0d-89d9-15c6a2be9353"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.253387 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-859kj\" (UniqueName: \"kubernetes.io/projected/9fb64305-2843-4f0d-89d9-15c6a2be9353-kube-api-access-859kj\") on node \"crc\" DevicePath \"\"" Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.253411 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb64305-2843-4f0d-89d9-15c6a2be9353-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.253421 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb64305-2843-4f0d-89d9-15c6a2be9353-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.767011 4835 generic.go:334] "Generic (PLEG): container finished" podID="9fb64305-2843-4f0d-89d9-15c6a2be9353" containerID="8256d5dcfe5d00bc9dd8dd9c9fc015b6b2acf2347eba9cabfe02d675fd2f0014" exitCode=0 Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.767277 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frjq8" event={"ID":"9fb64305-2843-4f0d-89d9-15c6a2be9353","Type":"ContainerDied","Data":"8256d5dcfe5d00bc9dd8dd9c9fc015b6b2acf2347eba9cabfe02d675fd2f0014"} Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.767390 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frjq8" event={"ID":"9fb64305-2843-4f0d-89d9-15c6a2be9353","Type":"ContainerDied","Data":"de61510676af3421c2ff1b69ac28af89e98b47ad0497553f6ad3832461dc60df"} Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.767488 4835 scope.go:117] "RemoveContainer" containerID="8256d5dcfe5d00bc9dd8dd9c9fc015b6b2acf2347eba9cabfe02d675fd2f0014" Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.767751 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-frjq8" Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.785087 4835 scope.go:117] "RemoveContainer" containerID="b4b9d9b029dcfb24723f3a5681a20772b694fec3244b54393e559d02fa61667a" Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.801023 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-frjq8"] Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.802560 4835 scope.go:117] "RemoveContainer" containerID="cdf2437c402de25c396f95f6378e8257b6d7e64b9e3358a0be17ca7699abc99a" Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.804237 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-frjq8"] Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.815306 4835 scope.go:117] "RemoveContainer" containerID="8256d5dcfe5d00bc9dd8dd9c9fc015b6b2acf2347eba9cabfe02d675fd2f0014" Oct 03 18:17:59 crc kubenswrapper[4835]: E1003 18:17:59.815615 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8256d5dcfe5d00bc9dd8dd9c9fc015b6b2acf2347eba9cabfe02d675fd2f0014\": container with ID starting with 8256d5dcfe5d00bc9dd8dd9c9fc015b6b2acf2347eba9cabfe02d675fd2f0014 not found: ID does not exist" containerID="8256d5dcfe5d00bc9dd8dd9c9fc015b6b2acf2347eba9cabfe02d675fd2f0014" Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.815645 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8256d5dcfe5d00bc9dd8dd9c9fc015b6b2acf2347eba9cabfe02d675fd2f0014"} err="failed to get container status \"8256d5dcfe5d00bc9dd8dd9c9fc015b6b2acf2347eba9cabfe02d675fd2f0014\": rpc error: code = NotFound desc = could not find container \"8256d5dcfe5d00bc9dd8dd9c9fc015b6b2acf2347eba9cabfe02d675fd2f0014\": container with ID starting with 8256d5dcfe5d00bc9dd8dd9c9fc015b6b2acf2347eba9cabfe02d675fd2f0014 not found: ID does not exist" Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.815669 4835 scope.go:117] "RemoveContainer" containerID="b4b9d9b029dcfb24723f3a5681a20772b694fec3244b54393e559d02fa61667a" Oct 03 18:17:59 crc kubenswrapper[4835]: E1003 18:17:59.815951 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4b9d9b029dcfb24723f3a5681a20772b694fec3244b54393e559d02fa61667a\": container with ID starting with b4b9d9b029dcfb24723f3a5681a20772b694fec3244b54393e559d02fa61667a not found: ID does not exist" containerID="b4b9d9b029dcfb24723f3a5681a20772b694fec3244b54393e559d02fa61667a" Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.815976 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4b9d9b029dcfb24723f3a5681a20772b694fec3244b54393e559d02fa61667a"} err="failed to get container status \"b4b9d9b029dcfb24723f3a5681a20772b694fec3244b54393e559d02fa61667a\": rpc error: code = NotFound desc = could not find container \"b4b9d9b029dcfb24723f3a5681a20772b694fec3244b54393e559d02fa61667a\": container with ID starting with b4b9d9b029dcfb24723f3a5681a20772b694fec3244b54393e559d02fa61667a not found: ID does not exist" Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.815992 4835 scope.go:117] "RemoveContainer" containerID="cdf2437c402de25c396f95f6378e8257b6d7e64b9e3358a0be17ca7699abc99a" Oct 03 18:17:59 crc kubenswrapper[4835]: E1003 18:17:59.816324 4835 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"cdf2437c402de25c396f95f6378e8257b6d7e64b9e3358a0be17ca7699abc99a\": container with ID starting with cdf2437c402de25c396f95f6378e8257b6d7e64b9e3358a0be17ca7699abc99a not found: ID does not exist" containerID="cdf2437c402de25c396f95f6378e8257b6d7e64b9e3358a0be17ca7699abc99a" Oct 03 18:17:59 crc kubenswrapper[4835]: I1003 18:17:59.816350 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdf2437c402de25c396f95f6378e8257b6d7e64b9e3358a0be17ca7699abc99a"} err="failed to get container status \"cdf2437c402de25c396f95f6378e8257b6d7e64b9e3358a0be17ca7699abc99a\": rpc error: code = NotFound desc = could not find container \"cdf2437c402de25c396f95f6378e8257b6d7e64b9e3358a0be17ca7699abc99a\": container with ID starting with cdf2437c402de25c396f95f6378e8257b6d7e64b9e3358a0be17ca7699abc99a not found: ID does not exist" Oct 03 18:18:00 crc kubenswrapper[4835]: I1003 18:18:00.882983 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fb64305-2843-4f0d-89d9-15c6a2be9353" path="/var/lib/kubelet/pods/9fb64305-2843-4f0d-89d9-15c6a2be9353/volumes" Oct 03 18:18:05 crc kubenswrapper[4835]: I1003 18:18:05.358530 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:18:05 crc kubenswrapper[4835]: I1003 18:18:05.358941 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:18:05 crc kubenswrapper[4835]: I1003 18:18:05.358996 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:18:05 crc kubenswrapper[4835]: I1003 18:18:05.359628 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 18:18:05 crc kubenswrapper[4835]: I1003 18:18:05.359697 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf" gracePeriod=600 Oct 03 18:18:05 crc kubenswrapper[4835]: I1003 18:18:05.799096 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf" exitCode=0 Oct 03 18:18:05 crc kubenswrapper[4835]: I1003 18:18:05.799180 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" 
event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf"} Oct 03 18:18:05 crc kubenswrapper[4835]: I1003 18:18:05.799559 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"33b22db8eab068f3d27b86b574d9d679f3087e01aa6ee7e5483fdafa16b4a8b9"} Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.002738 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" podUID="810a8fd3-d63d-4fd1-b6f1-186457e8878a" containerName="oauth-openshift" containerID="cri-o://b4d11948c331929a29b91df54b914305217cae1238df57d1e4867522893f581c" gracePeriod=15 Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.318689 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354181 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl"] Oct 03 18:18:23 crc kubenswrapper[4835]: E1003 18:18:23.354493 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb64305-2843-4f0d-89d9-15c6a2be9353" containerName="extract-utilities" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354514 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb64305-2843-4f0d-89d9-15c6a2be9353" containerName="extract-utilities" Oct 03 18:18:23 crc kubenswrapper[4835]: E1003 18:18:23.354530 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84574502-e2af-4d0b-83e9-206513b44cbc" containerName="extract-content" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354538 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="84574502-e2af-4d0b-83e9-206513b44cbc" containerName="extract-content" Oct 03 18:18:23 crc kubenswrapper[4835]: E1003 18:18:23.354551 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84574502-e2af-4d0b-83e9-206513b44cbc" containerName="extract-utilities" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354560 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="84574502-e2af-4d0b-83e9-206513b44cbc" containerName="extract-utilities" Oct 03 18:18:23 crc kubenswrapper[4835]: E1003 18:18:23.354569 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801fdb9d-0303-46b9-aa79-8c30cda070eb" containerName="registry-server" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354577 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="801fdb9d-0303-46b9-aa79-8c30cda070eb" containerName="registry-server" Oct 03 18:18:23 crc kubenswrapper[4835]: E1003 18:18:23.354589 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="810a8fd3-d63d-4fd1-b6f1-186457e8878a" containerName="oauth-openshift" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354597 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="810a8fd3-d63d-4fd1-b6f1-186457e8878a" containerName="oauth-openshift" Oct 03 18:18:23 crc kubenswrapper[4835]: E1003 18:18:23.354610 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64185dc5-d857-4370-8a3b-f1e5ca460bc0" containerName="extract-content" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354618 4835 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="64185dc5-d857-4370-8a3b-f1e5ca460bc0" containerName="extract-content" Oct 03 18:18:23 crc kubenswrapper[4835]: E1003 18:18:23.354628 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64185dc5-d857-4370-8a3b-f1e5ca460bc0" containerName="registry-server" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354636 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="64185dc5-d857-4370-8a3b-f1e5ca460bc0" containerName="registry-server" Oct 03 18:18:23 crc kubenswrapper[4835]: E1003 18:18:23.354644 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801fdb9d-0303-46b9-aa79-8c30cda070eb" containerName="extract-utilities" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354651 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="801fdb9d-0303-46b9-aa79-8c30cda070eb" containerName="extract-utilities" Oct 03 18:18:23 crc kubenswrapper[4835]: E1003 18:18:23.354720 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb64305-2843-4f0d-89d9-15c6a2be9353" containerName="extract-content" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354731 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb64305-2843-4f0d-89d9-15c6a2be9353" containerName="extract-content" Oct 03 18:18:23 crc kubenswrapper[4835]: E1003 18:18:23.354741 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84574502-e2af-4d0b-83e9-206513b44cbc" containerName="registry-server" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354749 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="84574502-e2af-4d0b-83e9-206513b44cbc" containerName="registry-server" Oct 03 18:18:23 crc kubenswrapper[4835]: E1003 18:18:23.354761 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64185dc5-d857-4370-8a3b-f1e5ca460bc0" containerName="extract-utilities" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354769 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="64185dc5-d857-4370-8a3b-f1e5ca460bc0" containerName="extract-utilities" Oct 03 18:18:23 crc kubenswrapper[4835]: E1003 18:18:23.354778 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9880164-05de-4884-81a8-c2607d0865ed" containerName="pruner" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354786 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9880164-05de-4884-81a8-c2607d0865ed" containerName="pruner" Oct 03 18:18:23 crc kubenswrapper[4835]: E1003 18:18:23.354795 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb64305-2843-4f0d-89d9-15c6a2be9353" containerName="registry-server" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354804 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb64305-2843-4f0d-89d9-15c6a2be9353" containerName="registry-server" Oct 03 18:18:23 crc kubenswrapper[4835]: E1003 18:18:23.354817 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801fdb9d-0303-46b9-aa79-8c30cda070eb" containerName="extract-content" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354824 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="801fdb9d-0303-46b9-aa79-8c30cda070eb" containerName="extract-content" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354930 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb64305-2843-4f0d-89d9-15c6a2be9353" containerName="registry-server" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354947 4835 
memory_manager.go:354] "RemoveStaleState removing state" podUID="84574502-e2af-4d0b-83e9-206513b44cbc" containerName="registry-server" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354960 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="801fdb9d-0303-46b9-aa79-8c30cda070eb" containerName="registry-server" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354969 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9880164-05de-4884-81a8-c2607d0865ed" containerName="pruner" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354980 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="64185dc5-d857-4370-8a3b-f1e5ca460bc0" containerName="registry-server" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.354992 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="810a8fd3-d63d-4fd1-b6f1-186457e8878a" containerName="oauth-openshift" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.355834 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.366643 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl"] Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.454518 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-ocp-branding-template\") pod \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.454566 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-service-ca\") pod \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.454929 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/810a8fd3-d63d-4fd1-b6f1-186457e8878a-audit-dir\") pod \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.454960 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-audit-policies\") pod \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.454981 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-session\") pod \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.454998 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-router-certs\") pod \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\" (UID: 
\"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455021 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-cliconfig\") pod \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455043 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-idp-0-file-data\") pod \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455062 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-trusted-ca-bundle\") pod \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455101 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-login\") pod \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455115 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-error\") pod \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455130 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-serving-cert\") pod \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455151 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr2n7\" (UniqueName: \"kubernetes.io/projected/810a8fd3-d63d-4fd1-b6f1-186457e8878a-kube-api-access-zr2n7\") pod \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455172 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-provider-selection\") pod \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\" (UID: \"810a8fd3-d63d-4fd1-b6f1-186457e8878a\") " Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455238 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-user-template-error\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " 
pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455281 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-service-ca\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455308 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455327 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455352 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-audit-policies\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455370 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-session\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455390 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455404 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzm9v\" (UniqueName: \"kubernetes.io/projected/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-kube-api-access-qzm9v\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455429 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-router-certs\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455450 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-audit-dir\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455523 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455548 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-user-template-login\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455568 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455589 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455622 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/810a8fd3-d63d-4fd1-b6f1-186457e8878a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "810a8fd3-d63d-4fd1-b6f1-186457e8878a" (UID: "810a8fd3-d63d-4fd1-b6f1-186457e8878a"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.455857 4835 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/810a8fd3-d63d-4fd1-b6f1-186457e8878a-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.456918 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "810a8fd3-d63d-4fd1-b6f1-186457e8878a" (UID: "810a8fd3-d63d-4fd1-b6f1-186457e8878a"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.456959 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "810a8fd3-d63d-4fd1-b6f1-186457e8878a" (UID: "810a8fd3-d63d-4fd1-b6f1-186457e8878a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.457415 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "810a8fd3-d63d-4fd1-b6f1-186457e8878a" (UID: "810a8fd3-d63d-4fd1-b6f1-186457e8878a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.458696 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "810a8fd3-d63d-4fd1-b6f1-186457e8878a" (UID: "810a8fd3-d63d-4fd1-b6f1-186457e8878a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.460486 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "810a8fd3-d63d-4fd1-b6f1-186457e8878a" (UID: "810a8fd3-d63d-4fd1-b6f1-186457e8878a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.460720 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "810a8fd3-d63d-4fd1-b6f1-186457e8878a" (UID: "810a8fd3-d63d-4fd1-b6f1-186457e8878a"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.460795 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/810a8fd3-d63d-4fd1-b6f1-186457e8878a-kube-api-access-zr2n7" (OuterVolumeSpecName: "kube-api-access-zr2n7") pod "810a8fd3-d63d-4fd1-b6f1-186457e8878a" (UID: "810a8fd3-d63d-4fd1-b6f1-186457e8878a"). InnerVolumeSpecName "kube-api-access-zr2n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.460927 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "810a8fd3-d63d-4fd1-b6f1-186457e8878a" (UID: "810a8fd3-d63d-4fd1-b6f1-186457e8878a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.461100 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "810a8fd3-d63d-4fd1-b6f1-186457e8878a" (UID: "810a8fd3-d63d-4fd1-b6f1-186457e8878a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.461514 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "810a8fd3-d63d-4fd1-b6f1-186457e8878a" (UID: "810a8fd3-d63d-4fd1-b6f1-186457e8878a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.465498 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "810a8fd3-d63d-4fd1-b6f1-186457e8878a" (UID: "810a8fd3-d63d-4fd1-b6f1-186457e8878a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.466292 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "810a8fd3-d63d-4fd1-b6f1-186457e8878a" (UID: "810a8fd3-d63d-4fd1-b6f1-186457e8878a"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.466487 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "810a8fd3-d63d-4fd1-b6f1-186457e8878a" (UID: "810a8fd3-d63d-4fd1-b6f1-186457e8878a"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.556859 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-user-template-error\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.556920 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-service-ca\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.556948 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.556968 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.556990 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-audit-policies\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557006 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-session\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557028 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557045 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzm9v\" (UniqueName: \"kubernetes.io/projected/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-kube-api-access-qzm9v\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 
03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557087 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-router-certs\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557158 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-audit-dir\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557184 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557206 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-user-template-login\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557225 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557246 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557287 4835 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557298 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557308 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:23 crc kubenswrapper[4835]: 
I1003 18:18:23.557317 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557327 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557336 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557370 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557381 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557390 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557399 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr2n7\" (UniqueName: \"kubernetes.io/projected/810a8fd3-d63d-4fd1-b6f1-186457e8878a-kube-api-access-zr2n7\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557410 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557422 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557432 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/810a8fd3-d63d-4fd1-b6f1-186457e8878a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.557397 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-audit-dir\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.558323 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-audit-policies\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.558517 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.559665 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.560281 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-user-template-error\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.560473 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-service-ca\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.560710 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.560756 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.561540 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-session\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.561588 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-router-certs\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.561763 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-user-template-login\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.561991 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.562501 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.572517 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzm9v\" (UniqueName: \"kubernetes.io/projected/4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a-kube-api-access-qzm9v\") pod \"oauth-openshift-7cc79f59b7-pkkkl\" (UID: \"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a\") " pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.680924 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.844122 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl"] Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.883617 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" event={"ID":"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a","Type":"ContainerStarted","Data":"5c8fcaf22d0f0ac158591c7265c6f0d541995bf25d6273e030f35a012da568f1"} Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.884922 4835 generic.go:334] "Generic (PLEG): container finished" podID="810a8fd3-d63d-4fd1-b6f1-186457e8878a" containerID="b4d11948c331929a29b91df54b914305217cae1238df57d1e4867522893f581c" exitCode=0 Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.884972 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.884972 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" event={"ID":"810a8fd3-d63d-4fd1-b6f1-186457e8878a","Type":"ContainerDied","Data":"b4d11948c331929a29b91df54b914305217cae1238df57d1e4867522893f581c"} Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.885212 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-p7wfs" event={"ID":"810a8fd3-d63d-4fd1-b6f1-186457e8878a","Type":"ContainerDied","Data":"8d7f726980351b1158ebaa5d8680efdd1b86ce6a33ee3b3d7717ce3bbe3bcb30"} Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.885233 4835 scope.go:117] "RemoveContainer" containerID="b4d11948c331929a29b91df54b914305217cae1238df57d1e4867522893f581c" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.903755 4835 scope.go:117] "RemoveContainer" containerID="b4d11948c331929a29b91df54b914305217cae1238df57d1e4867522893f581c" Oct 03 18:18:23 crc kubenswrapper[4835]: E1003 18:18:23.904399 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4d11948c331929a29b91df54b914305217cae1238df57d1e4867522893f581c\": container with ID starting with b4d11948c331929a29b91df54b914305217cae1238df57d1e4867522893f581c not found: ID does not exist" containerID="b4d11948c331929a29b91df54b914305217cae1238df57d1e4867522893f581c" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.904438 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d11948c331929a29b91df54b914305217cae1238df57d1e4867522893f581c"} err="failed to get container status \"b4d11948c331929a29b91df54b914305217cae1238df57d1e4867522893f581c\": rpc error: code = NotFound desc = could not find container \"b4d11948c331929a29b91df54b914305217cae1238df57d1e4867522893f581c\": container with ID starting with b4d11948c331929a29b91df54b914305217cae1238df57d1e4867522893f581c not found: ID does not exist" Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.914218 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p7wfs"] Oct 03 18:18:23 crc kubenswrapper[4835]: I1003 18:18:23.916974 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-p7wfs"] Oct 03 18:18:24 crc kubenswrapper[4835]: I1003 18:18:24.882461 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="810a8fd3-d63d-4fd1-b6f1-186457e8878a" path="/var/lib/kubelet/pods/810a8fd3-d63d-4fd1-b6f1-186457e8878a/volumes" Oct 03 18:18:24 crc kubenswrapper[4835]: I1003 18:18:24.891214 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" event={"ID":"4fb424bd-3e0f-4d11-ac8e-53c4b926bf6a","Type":"ContainerStarted","Data":"02798f01b8195f793c85aadcfff391845e39a81d741577d3b28ee435b5145e33"} Oct 03 18:18:24 crc kubenswrapper[4835]: I1003 18:18:24.891425 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:24 crc kubenswrapper[4835]: I1003 18:18:24.895901 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" Oct 03 18:18:24 crc kubenswrapper[4835]: 
I1003 18:18:24.914944 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7cc79f59b7-pkkkl" podStartSLOduration=27.914925076 podStartE2EDuration="27.914925076s" podCreationTimestamp="2025-10-03 18:17:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:18:24.908487547 +0000 UTC m=+246.624428419" watchObservedRunningTime="2025-10-03 18:18:24.914925076 +0000 UTC m=+246.630865958" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.518289 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ndfwz"] Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.520578 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ndfwz" podUID="fa6d5cf8-dfde-42f7-9507-48f41bf44b50" containerName="registry-server" containerID="cri-o://ee10c525603d688141ed43bd1e74e4bcff8f2e3e00b7b7669475a92eab4881b6" gracePeriod=30 Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.522799 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xjj46"] Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.522993 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xjj46" podUID="9b246bf3-c3d8-41d9-9ae1-660fdc057961" containerName="registry-server" containerID="cri-o://66c205247c1015e912787eaacf2b023ee5cbc2544074cf330e681b53cd0d4c4f" gracePeriod=30 Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.530372 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pf9vb"] Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.530826 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" podUID="6818d850-0c23-481b-b3f5-fbb31275d97f" containerName="marketplace-operator" containerID="cri-o://3078d612e85d68749cc69955c879ce82cc5ff9c4a5da27ceb32f92b719417091" gracePeriod=30 Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.539606 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrf6l"] Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.540706 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zrf6l" podUID="ed6d7146-dd06-4086-8b38-2140c5deeff9" containerName="registry-server" containerID="cri-o://70ff4885bc0cd2cf0d5f3caf9c0544c4117cf5d70a7ac20854343bbb48a18986" gracePeriod=30 Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.553182 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2lz79"] Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.553954 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2lz79" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.560342 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2lz79"] Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.564618 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tr45z"] Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.564879 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tr45z" podUID="6fc82477-8141-4654-9153-b2a046309e8b" containerName="registry-server" containerID="cri-o://1a21044041fd8a83ac4a2b8d08cf845af1adc75efa16269436b2cbe4f421b5bc" gracePeriod=30 Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.695888 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdkj6\" (UniqueName: \"kubernetes.io/projected/340f6531-0442-4899-87ea-795466615b9b-kube-api-access-gdkj6\") pod \"marketplace-operator-79b997595-2lz79\" (UID: \"340f6531-0442-4899-87ea-795466615b9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lz79" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.695938 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/340f6531-0442-4899-87ea-795466615b9b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2lz79\" (UID: \"340f6531-0442-4899-87ea-795466615b9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lz79" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.695980 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/340f6531-0442-4899-87ea-795466615b9b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2lz79\" (UID: \"340f6531-0442-4899-87ea-795466615b9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lz79" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.797120 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdkj6\" (UniqueName: \"kubernetes.io/projected/340f6531-0442-4899-87ea-795466615b9b-kube-api-access-gdkj6\") pod \"marketplace-operator-79b997595-2lz79\" (UID: \"340f6531-0442-4899-87ea-795466615b9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lz79" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.797191 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/340f6531-0442-4899-87ea-795466615b9b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2lz79\" (UID: \"340f6531-0442-4899-87ea-795466615b9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lz79" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.797252 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/340f6531-0442-4899-87ea-795466615b9b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2lz79\" (UID: \"340f6531-0442-4899-87ea-795466615b9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lz79" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.798711 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/340f6531-0442-4899-87ea-795466615b9b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2lz79\" (UID: \"340f6531-0442-4899-87ea-795466615b9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lz79" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.804790 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/340f6531-0442-4899-87ea-795466615b9b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2lz79\" (UID: \"340f6531-0442-4899-87ea-795466615b9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lz79" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.816579 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdkj6\" (UniqueName: \"kubernetes.io/projected/340f6531-0442-4899-87ea-795466615b9b-kube-api-access-gdkj6\") pod \"marketplace-operator-79b997595-2lz79\" (UID: \"340f6531-0442-4899-87ea-795466615b9b\") " pod="openshift-marketplace/marketplace-operator-79b997595-2lz79" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.878758 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2lz79" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.964165 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.986749 4835 generic.go:334] "Generic (PLEG): container finished" podID="fa6d5cf8-dfde-42f7-9507-48f41bf44b50" containerID="ee10c525603d688141ed43bd1e74e4bcff8f2e3e00b7b7669475a92eab4881b6" exitCode=0 Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.986838 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ndfwz" event={"ID":"fa6d5cf8-dfde-42f7-9507-48f41bf44b50","Type":"ContainerDied","Data":"ee10c525603d688141ed43bd1e74e4bcff8f2e3e00b7b7669475a92eab4881b6"} Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.986845 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.986875 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ndfwz" event={"ID":"fa6d5cf8-dfde-42f7-9507-48f41bf44b50","Type":"ContainerDied","Data":"eb0fcc4293f2211d002cb1afd001adfece9e9293b30d92a5f32ae3c84ed5ea64"} Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.986894 4835 scope.go:117] "RemoveContainer" containerID="ee10c525603d688141ed43bd1e74e4bcff8f2e3e00b7b7669475a92eab4881b6" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.987032 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ndfwz" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.995983 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.996828 4835 generic.go:334] "Generic (PLEG): container finished" podID="ed6d7146-dd06-4086-8b38-2140c5deeff9" containerID="70ff4885bc0cd2cf0d5f3caf9c0544c4117cf5d70a7ac20854343bbb48a18986" exitCode=0 Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.996925 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrf6l" event={"ID":"ed6d7146-dd06-4086-8b38-2140c5deeff9","Type":"ContainerDied","Data":"70ff4885bc0cd2cf0d5f3caf9c0544c4117cf5d70a7ac20854343bbb48a18986"} Oct 03 18:18:43 crc kubenswrapper[4835]: I1003 18:18:43.996962 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrf6l" event={"ID":"ed6d7146-dd06-4086-8b38-2140c5deeff9","Type":"ContainerDied","Data":"cc95ab232307ac0e8abcf4ad87b4acabacf72c19f355c15b755b54096864984a"} Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.000864 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.001277 4835 generic.go:334] "Generic (PLEG): container finished" podID="9b246bf3-c3d8-41d9-9ae1-660fdc057961" containerID="66c205247c1015e912787eaacf2b023ee5cbc2544074cf330e681b53cd0d4c4f" exitCode=0 Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.001332 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjj46" event={"ID":"9b246bf3-c3d8-41d9-9ae1-660fdc057961","Type":"ContainerDied","Data":"66c205247c1015e912787eaacf2b023ee5cbc2544074cf330e681b53cd0d4c4f"} Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.001358 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjj46" event={"ID":"9b246bf3-c3d8-41d9-9ae1-660fdc057961","Type":"ContainerDied","Data":"d27d82e101ea9483228f5b4e8e3cf6c6d7211cc16b52b1b77eef286f26c81b77"} Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.001511 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xjj46" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.014329 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.041951 4835 generic.go:334] "Generic (PLEG): container finished" podID="6fc82477-8141-4654-9153-b2a046309e8b" containerID="1a21044041fd8a83ac4a2b8d08cf845af1adc75efa16269436b2cbe4f421b5bc" exitCode=0 Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.042019 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr45z" event={"ID":"6fc82477-8141-4654-9153-b2a046309e8b","Type":"ContainerDied","Data":"1a21044041fd8a83ac4a2b8d08cf845af1adc75efa16269436b2cbe4f421b5bc"} Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.042047 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr45z" event={"ID":"6fc82477-8141-4654-9153-b2a046309e8b","Type":"ContainerDied","Data":"dea543a28a44983a3c697558ec2c5d0030dfa15c8bf0857b0b3d8b9ac8140a0d"} Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.045034 4835 generic.go:334] "Generic (PLEG): container finished" podID="6818d850-0c23-481b-b3f5-fbb31275d97f" containerID="3078d612e85d68749cc69955c879ce82cc5ff9c4a5da27ceb32f92b719417091" exitCode=0 Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.045060 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" event={"ID":"6818d850-0c23-481b-b3f5-fbb31275d97f","Type":"ContainerDied","Data":"3078d612e85d68749cc69955c879ce82cc5ff9c4a5da27ceb32f92b719417091"} Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.045093 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" event={"ID":"6818d850-0c23-481b-b3f5-fbb31275d97f","Type":"ContainerDied","Data":"ae68ca3c1c972b7e78dabcd3d3cba3c3374507705fb8a959d86e3aa45bfdb627"} Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.045105 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pf9vb" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.058296 4835 scope.go:117] "RemoveContainer" containerID="d6e0faa30ca410cd6997f60a41b76ffbcb3bc8b25eb3aeb4aa5b54212bb62a8e" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.105342 4835 scope.go:117] "RemoveContainer" containerID="fb06301098b71f3f3497d75c607b9be277f6e757c92aec7536c9db6272a24ac0" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.105563 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6d7146-dd06-4086-8b38-2140c5deeff9-utilities\") pod \"ed6d7146-dd06-4086-8b38-2140c5deeff9\" (UID: \"ed6d7146-dd06-4086-8b38-2140c5deeff9\") " Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.105599 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t8sj\" (UniqueName: \"kubernetes.io/projected/9b246bf3-c3d8-41d9-9ae1-660fdc057961-kube-api-access-4t8sj\") pod \"9b246bf3-c3d8-41d9-9ae1-660fdc057961\" (UID: \"9b246bf3-c3d8-41d9-9ae1-660fdc057961\") " Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.105622 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g9nw\" (UniqueName: \"kubernetes.io/projected/6fc82477-8141-4654-9153-b2a046309e8b-kube-api-access-8g9nw\") pod \"6fc82477-8141-4654-9153-b2a046309e8b\" (UID: \"6fc82477-8141-4654-9153-b2a046309e8b\") " Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.105652 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b246bf3-c3d8-41d9-9ae1-660fdc057961-catalog-content\") pod \"9b246bf3-c3d8-41d9-9ae1-660fdc057961\" (UID: \"9b246bf3-c3d8-41d9-9ae1-660fdc057961\") " Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.105672 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6d7146-dd06-4086-8b38-2140c5deeff9-catalog-content\") pod \"ed6d7146-dd06-4086-8b38-2140c5deeff9\" (UID: \"ed6d7146-dd06-4086-8b38-2140c5deeff9\") " Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.105692 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-utilities\") pod \"fa6d5cf8-dfde-42f7-9507-48f41bf44b50\" (UID: \"fa6d5cf8-dfde-42f7-9507-48f41bf44b50\") " Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.105712 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b246bf3-c3d8-41d9-9ae1-660fdc057961-utilities\") pod \"9b246bf3-c3d8-41d9-9ae1-660fdc057961\" (UID: \"9b246bf3-c3d8-41d9-9ae1-660fdc057961\") " Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.105731 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fc82477-8141-4654-9153-b2a046309e8b-catalog-content\") pod \"6fc82477-8141-4654-9153-b2a046309e8b\" (UID: \"6fc82477-8141-4654-9153-b2a046309e8b\") " Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.105756 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fc82477-8141-4654-9153-b2a046309e8b-utilities\") pod 
\"6fc82477-8141-4654-9153-b2a046309e8b\" (UID: \"6fc82477-8141-4654-9153-b2a046309e8b\") " Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.105781 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mhtv\" (UniqueName: \"kubernetes.io/projected/ed6d7146-dd06-4086-8b38-2140c5deeff9-kube-api-access-8mhtv\") pod \"ed6d7146-dd06-4086-8b38-2140c5deeff9\" (UID: \"ed6d7146-dd06-4086-8b38-2140c5deeff9\") " Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.105808 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s2nx\" (UniqueName: \"kubernetes.io/projected/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-kube-api-access-4s2nx\") pod \"fa6d5cf8-dfde-42f7-9507-48f41bf44b50\" (UID: \"fa6d5cf8-dfde-42f7-9507-48f41bf44b50\") " Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.105864 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-catalog-content\") pod \"fa6d5cf8-dfde-42f7-9507-48f41bf44b50\" (UID: \"fa6d5cf8-dfde-42f7-9507-48f41bf44b50\") " Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.115478 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fc82477-8141-4654-9153-b2a046309e8b-kube-api-access-8g9nw" (OuterVolumeSpecName: "kube-api-access-8g9nw") pod "6fc82477-8141-4654-9153-b2a046309e8b" (UID: "6fc82477-8141-4654-9153-b2a046309e8b"). InnerVolumeSpecName "kube-api-access-8g9nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.116744 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fc82477-8141-4654-9153-b2a046309e8b-utilities" (OuterVolumeSpecName: "utilities") pod "6fc82477-8141-4654-9153-b2a046309e8b" (UID: "6fc82477-8141-4654-9153-b2a046309e8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.116768 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed6d7146-dd06-4086-8b38-2140c5deeff9-utilities" (OuterVolumeSpecName: "utilities") pod "ed6d7146-dd06-4086-8b38-2140c5deeff9" (UID: "ed6d7146-dd06-4086-8b38-2140c5deeff9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.116845 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b246bf3-c3d8-41d9-9ae1-660fdc057961-utilities" (OuterVolumeSpecName: "utilities") pod "9b246bf3-c3d8-41d9-9ae1-660fdc057961" (UID: "9b246bf3-c3d8-41d9-9ae1-660fdc057961"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.117214 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-utilities" (OuterVolumeSpecName: "utilities") pod "fa6d5cf8-dfde-42f7-9507-48f41bf44b50" (UID: "fa6d5cf8-dfde-42f7-9507-48f41bf44b50"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.121548 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed6d7146-dd06-4086-8b38-2140c5deeff9-kube-api-access-8mhtv" (OuterVolumeSpecName: "kube-api-access-8mhtv") pod "ed6d7146-dd06-4086-8b38-2140c5deeff9" (UID: "ed6d7146-dd06-4086-8b38-2140c5deeff9"). InnerVolumeSpecName "kube-api-access-8mhtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.116059 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b246bf3-c3d8-41d9-9ae1-660fdc057961-kube-api-access-4t8sj" (OuterVolumeSpecName: "kube-api-access-4t8sj") pod "9b246bf3-c3d8-41d9-9ae1-660fdc057961" (UID: "9b246bf3-c3d8-41d9-9ae1-660fdc057961"). InnerVolumeSpecName "kube-api-access-4t8sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.122262 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-kube-api-access-4s2nx" (OuterVolumeSpecName: "kube-api-access-4s2nx") pod "fa6d5cf8-dfde-42f7-9507-48f41bf44b50" (UID: "fa6d5cf8-dfde-42f7-9507-48f41bf44b50"). InnerVolumeSpecName "kube-api-access-4s2nx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.146971 4835 scope.go:117] "RemoveContainer" containerID="ee10c525603d688141ed43bd1e74e4bcff8f2e3e00b7b7669475a92eab4881b6" Oct 03 18:18:44 crc kubenswrapper[4835]: E1003 18:18:44.147454 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee10c525603d688141ed43bd1e74e4bcff8f2e3e00b7b7669475a92eab4881b6\": container with ID starting with ee10c525603d688141ed43bd1e74e4bcff8f2e3e00b7b7669475a92eab4881b6 not found: ID does not exist" containerID="ee10c525603d688141ed43bd1e74e4bcff8f2e3e00b7b7669475a92eab4881b6" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.147484 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee10c525603d688141ed43bd1e74e4bcff8f2e3e00b7b7669475a92eab4881b6"} err="failed to get container status \"ee10c525603d688141ed43bd1e74e4bcff8f2e3e00b7b7669475a92eab4881b6\": rpc error: code = NotFound desc = could not find container \"ee10c525603d688141ed43bd1e74e4bcff8f2e3e00b7b7669475a92eab4881b6\": container with ID starting with ee10c525603d688141ed43bd1e74e4bcff8f2e3e00b7b7669475a92eab4881b6 not found: ID does not exist" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.147504 4835 scope.go:117] "RemoveContainer" containerID="d6e0faa30ca410cd6997f60a41b76ffbcb3bc8b25eb3aeb4aa5b54212bb62a8e" Oct 03 18:18:44 crc kubenswrapper[4835]: E1003 18:18:44.148470 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6e0faa30ca410cd6997f60a41b76ffbcb3bc8b25eb3aeb4aa5b54212bb62a8e\": container with ID starting with d6e0faa30ca410cd6997f60a41b76ffbcb3bc8b25eb3aeb4aa5b54212bb62a8e not found: ID does not exist" containerID="d6e0faa30ca410cd6997f60a41b76ffbcb3bc8b25eb3aeb4aa5b54212bb62a8e" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.148523 4835 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d6e0faa30ca410cd6997f60a41b76ffbcb3bc8b25eb3aeb4aa5b54212bb62a8e"} err="failed to get container status \"d6e0faa30ca410cd6997f60a41b76ffbcb3bc8b25eb3aeb4aa5b54212bb62a8e\": rpc error: code = NotFound desc = could not find container \"d6e0faa30ca410cd6997f60a41b76ffbcb3bc8b25eb3aeb4aa5b54212bb62a8e\": container with ID starting with d6e0faa30ca410cd6997f60a41b76ffbcb3bc8b25eb3aeb4aa5b54212bb62a8e not found: ID does not exist" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.148537 4835 scope.go:117] "RemoveContainer" containerID="fb06301098b71f3f3497d75c607b9be277f6e757c92aec7536c9db6272a24ac0" Oct 03 18:18:44 crc kubenswrapper[4835]: E1003 18:18:44.148972 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb06301098b71f3f3497d75c607b9be277f6e757c92aec7536c9db6272a24ac0\": container with ID starting with fb06301098b71f3f3497d75c607b9be277f6e757c92aec7536c9db6272a24ac0 not found: ID does not exist" containerID="fb06301098b71f3f3497d75c607b9be277f6e757c92aec7536c9db6272a24ac0" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.149026 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb06301098b71f3f3497d75c607b9be277f6e757c92aec7536c9db6272a24ac0"} err="failed to get container status \"fb06301098b71f3f3497d75c607b9be277f6e757c92aec7536c9db6272a24ac0\": rpc error: code = NotFound desc = could not find container \"fb06301098b71f3f3497d75c607b9be277f6e757c92aec7536c9db6272a24ac0\": container with ID starting with fb06301098b71f3f3497d75c607b9be277f6e757c92aec7536c9db6272a24ac0 not found: ID does not exist" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.149041 4835 scope.go:117] "RemoveContainer" containerID="70ff4885bc0cd2cf0d5f3caf9c0544c4117cf5d70a7ac20854343bbb48a18986" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.153385 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed6d7146-dd06-4086-8b38-2140c5deeff9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed6d7146-dd06-4086-8b38-2140c5deeff9" (UID: "ed6d7146-dd06-4086-8b38-2140c5deeff9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.177269 4835 scope.go:117] "RemoveContainer" containerID="83a08a436c027e500ff2159dcd4f23b67ebdd325eca5c9e7ecf14e3fa77a2344" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.178342 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2lz79"] Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.187170 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa6d5cf8-dfde-42f7-9507-48f41bf44b50" (UID: "fa6d5cf8-dfde-42f7-9507-48f41bf44b50"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.193329 4835 scope.go:117] "RemoveContainer" containerID="d544a375ca5127e76b8115106ca6a918427049fb18bc9690957b666da42f7bbc" Oct 03 18:18:44 crc kubenswrapper[4835]: W1003 18:18:44.197285 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod340f6531_0442_4899_87ea_795466615b9b.slice/crio-ef266e2647c9575d763f030ad45641357131c1338d2c68bad3811e3e62439859 WatchSource:0}: Error finding container ef266e2647c9575d763f030ad45641357131c1338d2c68bad3811e3e62439859: Status 404 returned error can't find the container with id ef266e2647c9575d763f030ad45641357131c1338d2c68bad3811e3e62439859 Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.200795 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b246bf3-c3d8-41d9-9ae1-660fdc057961-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b246bf3-c3d8-41d9-9ae1-660fdc057961" (UID: "9b246bf3-c3d8-41d9-9ae1-660fdc057961"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.207344 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6818d850-0c23-481b-b3f5-fbb31275d97f-marketplace-trusted-ca\") pod \"6818d850-0c23-481b-b3f5-fbb31275d97f\" (UID: \"6818d850-0c23-481b-b3f5-fbb31275d97f\") " Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.207383 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bfn2\" (UniqueName: \"kubernetes.io/projected/6818d850-0c23-481b-b3f5-fbb31275d97f-kube-api-access-2bfn2\") pod \"6818d850-0c23-481b-b3f5-fbb31275d97f\" (UID: \"6818d850-0c23-481b-b3f5-fbb31275d97f\") " Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.207422 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6818d850-0c23-481b-b3f5-fbb31275d97f-marketplace-operator-metrics\") pod \"6818d850-0c23-481b-b3f5-fbb31275d97f\" (UID: \"6818d850-0c23-481b-b3f5-fbb31275d97f\") " Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.207669 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.207684 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6d7146-dd06-4086-8b38-2140c5deeff9-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.207693 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t8sj\" (UniqueName: \"kubernetes.io/projected/9b246bf3-c3d8-41d9-9ae1-660fdc057961-kube-api-access-4t8sj\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.207704 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g9nw\" (UniqueName: \"kubernetes.io/projected/6fc82477-8141-4654-9153-b2a046309e8b-kube-api-access-8g9nw\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.207713 4835 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b246bf3-c3d8-41d9-9ae1-660fdc057961-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.207722 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6d7146-dd06-4086-8b38-2140c5deeff9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.207730 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.207740 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b246bf3-c3d8-41d9-9ae1-660fdc057961-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.207748 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fc82477-8141-4654-9153-b2a046309e8b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.207756 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mhtv\" (UniqueName: \"kubernetes.io/projected/ed6d7146-dd06-4086-8b38-2140c5deeff9-kube-api-access-8mhtv\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.207765 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4s2nx\" (UniqueName: \"kubernetes.io/projected/fa6d5cf8-dfde-42f7-9507-48f41bf44b50-kube-api-access-4s2nx\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.208578 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6818d850-0c23-481b-b3f5-fbb31275d97f-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6818d850-0c23-481b-b3f5-fbb31275d97f" (UID: "6818d850-0c23-481b-b3f5-fbb31275d97f"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.211563 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6818d850-0c23-481b-b3f5-fbb31275d97f-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6818d850-0c23-481b-b3f5-fbb31275d97f" (UID: "6818d850-0c23-481b-b3f5-fbb31275d97f"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.212937 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6818d850-0c23-481b-b3f5-fbb31275d97f-kube-api-access-2bfn2" (OuterVolumeSpecName: "kube-api-access-2bfn2") pod "6818d850-0c23-481b-b3f5-fbb31275d97f" (UID: "6818d850-0c23-481b-b3f5-fbb31275d97f"). InnerVolumeSpecName "kube-api-access-2bfn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.217247 4835 scope.go:117] "RemoveContainer" containerID="70ff4885bc0cd2cf0d5f3caf9c0544c4117cf5d70a7ac20854343bbb48a18986" Oct 03 18:18:44 crc kubenswrapper[4835]: E1003 18:18:44.217874 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70ff4885bc0cd2cf0d5f3caf9c0544c4117cf5d70a7ac20854343bbb48a18986\": container with ID starting with 70ff4885bc0cd2cf0d5f3caf9c0544c4117cf5d70a7ac20854343bbb48a18986 not found: ID does not exist" containerID="70ff4885bc0cd2cf0d5f3caf9c0544c4117cf5d70a7ac20854343bbb48a18986" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.217963 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ff4885bc0cd2cf0d5f3caf9c0544c4117cf5d70a7ac20854343bbb48a18986"} err="failed to get container status \"70ff4885bc0cd2cf0d5f3caf9c0544c4117cf5d70a7ac20854343bbb48a18986\": rpc error: code = NotFound desc = could not find container \"70ff4885bc0cd2cf0d5f3caf9c0544c4117cf5d70a7ac20854343bbb48a18986\": container with ID starting with 70ff4885bc0cd2cf0d5f3caf9c0544c4117cf5d70a7ac20854343bbb48a18986 not found: ID does not exist" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.218048 4835 scope.go:117] "RemoveContainer" containerID="83a08a436c027e500ff2159dcd4f23b67ebdd325eca5c9e7ecf14e3fa77a2344" Oct 03 18:18:44 crc kubenswrapper[4835]: E1003 18:18:44.218509 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83a08a436c027e500ff2159dcd4f23b67ebdd325eca5c9e7ecf14e3fa77a2344\": container with ID starting with 83a08a436c027e500ff2159dcd4f23b67ebdd325eca5c9e7ecf14e3fa77a2344 not found: ID does not exist" containerID="83a08a436c027e500ff2159dcd4f23b67ebdd325eca5c9e7ecf14e3fa77a2344" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.218565 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83a08a436c027e500ff2159dcd4f23b67ebdd325eca5c9e7ecf14e3fa77a2344"} err="failed to get container status \"83a08a436c027e500ff2159dcd4f23b67ebdd325eca5c9e7ecf14e3fa77a2344\": rpc error: code = NotFound desc = could not find container \"83a08a436c027e500ff2159dcd4f23b67ebdd325eca5c9e7ecf14e3fa77a2344\": container with ID starting with 83a08a436c027e500ff2159dcd4f23b67ebdd325eca5c9e7ecf14e3fa77a2344 not found: ID does not exist" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.218608 4835 scope.go:117] "RemoveContainer" containerID="d544a375ca5127e76b8115106ca6a918427049fb18bc9690957b666da42f7bbc" Oct 03 18:18:44 crc kubenswrapper[4835]: E1003 18:18:44.218891 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d544a375ca5127e76b8115106ca6a918427049fb18bc9690957b666da42f7bbc\": container with ID starting with d544a375ca5127e76b8115106ca6a918427049fb18bc9690957b666da42f7bbc not found: ID does not exist" containerID="d544a375ca5127e76b8115106ca6a918427049fb18bc9690957b666da42f7bbc" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.218972 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d544a375ca5127e76b8115106ca6a918427049fb18bc9690957b666da42f7bbc"} err="failed to get container status \"d544a375ca5127e76b8115106ca6a918427049fb18bc9690957b666da42f7bbc\": rpc error: code = NotFound desc = could not 
find container \"d544a375ca5127e76b8115106ca6a918427049fb18bc9690957b666da42f7bbc\": container with ID starting with d544a375ca5127e76b8115106ca6a918427049fb18bc9690957b666da42f7bbc not found: ID does not exist" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.219056 4835 scope.go:117] "RemoveContainer" containerID="66c205247c1015e912787eaacf2b023ee5cbc2544074cf330e681b53cd0d4c4f" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.236480 4835 scope.go:117] "RemoveContainer" containerID="84e65d368e36f01bd7ce2a66235690b063a8c5e49324c4f6d01417438728ac64" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.256311 4835 scope.go:117] "RemoveContainer" containerID="40ca00b28223b3725170500ec9be054191e082f53f3c428d156bf5efc391899a" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.274031 4835 scope.go:117] "RemoveContainer" containerID="66c205247c1015e912787eaacf2b023ee5cbc2544074cf330e681b53cd0d4c4f" Oct 03 18:18:44 crc kubenswrapper[4835]: E1003 18:18:44.274794 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66c205247c1015e912787eaacf2b023ee5cbc2544074cf330e681b53cd0d4c4f\": container with ID starting with 66c205247c1015e912787eaacf2b023ee5cbc2544074cf330e681b53cd0d4c4f not found: ID does not exist" containerID="66c205247c1015e912787eaacf2b023ee5cbc2544074cf330e681b53cd0d4c4f" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.275402 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c205247c1015e912787eaacf2b023ee5cbc2544074cf330e681b53cd0d4c4f"} err="failed to get container status \"66c205247c1015e912787eaacf2b023ee5cbc2544074cf330e681b53cd0d4c4f\": rpc error: code = NotFound desc = could not find container \"66c205247c1015e912787eaacf2b023ee5cbc2544074cf330e681b53cd0d4c4f\": container with ID starting with 66c205247c1015e912787eaacf2b023ee5cbc2544074cf330e681b53cd0d4c4f not found: ID does not exist" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.275443 4835 scope.go:117] "RemoveContainer" containerID="84e65d368e36f01bd7ce2a66235690b063a8c5e49324c4f6d01417438728ac64" Oct 03 18:18:44 crc kubenswrapper[4835]: E1003 18:18:44.276031 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84e65d368e36f01bd7ce2a66235690b063a8c5e49324c4f6d01417438728ac64\": container with ID starting with 84e65d368e36f01bd7ce2a66235690b063a8c5e49324c4f6d01417438728ac64 not found: ID does not exist" containerID="84e65d368e36f01bd7ce2a66235690b063a8c5e49324c4f6d01417438728ac64" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.276178 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84e65d368e36f01bd7ce2a66235690b063a8c5e49324c4f6d01417438728ac64"} err="failed to get container status \"84e65d368e36f01bd7ce2a66235690b063a8c5e49324c4f6d01417438728ac64\": rpc error: code = NotFound desc = could not find container \"84e65d368e36f01bd7ce2a66235690b063a8c5e49324c4f6d01417438728ac64\": container with ID starting with 84e65d368e36f01bd7ce2a66235690b063a8c5e49324c4f6d01417438728ac64 not found: ID does not exist" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.276281 4835 scope.go:117] "RemoveContainer" containerID="40ca00b28223b3725170500ec9be054191e082f53f3c428d156bf5efc391899a" Oct 03 18:18:44 crc kubenswrapper[4835]: E1003 18:18:44.276607 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"40ca00b28223b3725170500ec9be054191e082f53f3c428d156bf5efc391899a\": container with ID starting with 40ca00b28223b3725170500ec9be054191e082f53f3c428d156bf5efc391899a not found: ID does not exist" containerID="40ca00b28223b3725170500ec9be054191e082f53f3c428d156bf5efc391899a" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.276711 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ca00b28223b3725170500ec9be054191e082f53f3c428d156bf5efc391899a"} err="failed to get container status \"40ca00b28223b3725170500ec9be054191e082f53f3c428d156bf5efc391899a\": rpc error: code = NotFound desc = could not find container \"40ca00b28223b3725170500ec9be054191e082f53f3c428d156bf5efc391899a\": container with ID starting with 40ca00b28223b3725170500ec9be054191e082f53f3c428d156bf5efc391899a not found: ID does not exist" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.276807 4835 scope.go:117] "RemoveContainer" containerID="1a21044041fd8a83ac4a2b8d08cf845af1adc75efa16269436b2cbe4f421b5bc" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.287545 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fc82477-8141-4654-9153-b2a046309e8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fc82477-8141-4654-9153-b2a046309e8b" (UID: "6fc82477-8141-4654-9153-b2a046309e8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.310216 4835 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6818d850-0c23-481b-b3f5-fbb31275d97f-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.310252 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fc82477-8141-4654-9153-b2a046309e8b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.310264 4835 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6818d850-0c23-481b-b3f5-fbb31275d97f-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.310280 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bfn2\" (UniqueName: \"kubernetes.io/projected/6818d850-0c23-481b-b3f5-fbb31275d97f-kube-api-access-2bfn2\") on node \"crc\" DevicePath \"\"" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.315777 4835 scope.go:117] "RemoveContainer" containerID="0c4f7bdaf5ccec3e9db60f01a7dd1ef3f1e49b205ac3de94f127c7fd91f2ae4d" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.336323 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ndfwz"] Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.338390 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ndfwz"] Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.343006 4835 scope.go:117] "RemoveContainer" containerID="47366d00d368cbcbdf0ffa673294f1d624d4c08c5df95455638b43ed42dfcfd9" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.354646 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xjj46"] Oct 03 
18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.357760 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xjj46"] Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.364669 4835 scope.go:117] "RemoveContainer" containerID="1a21044041fd8a83ac4a2b8d08cf845af1adc75efa16269436b2cbe4f421b5bc" Oct 03 18:18:44 crc kubenswrapper[4835]: E1003 18:18:44.365488 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a21044041fd8a83ac4a2b8d08cf845af1adc75efa16269436b2cbe4f421b5bc\": container with ID starting with 1a21044041fd8a83ac4a2b8d08cf845af1adc75efa16269436b2cbe4f421b5bc not found: ID does not exist" containerID="1a21044041fd8a83ac4a2b8d08cf845af1adc75efa16269436b2cbe4f421b5bc" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.365563 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a21044041fd8a83ac4a2b8d08cf845af1adc75efa16269436b2cbe4f421b5bc"} err="failed to get container status \"1a21044041fd8a83ac4a2b8d08cf845af1adc75efa16269436b2cbe4f421b5bc\": rpc error: code = NotFound desc = could not find container \"1a21044041fd8a83ac4a2b8d08cf845af1adc75efa16269436b2cbe4f421b5bc\": container with ID starting with 1a21044041fd8a83ac4a2b8d08cf845af1adc75efa16269436b2cbe4f421b5bc not found: ID does not exist" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.365594 4835 scope.go:117] "RemoveContainer" containerID="0c4f7bdaf5ccec3e9db60f01a7dd1ef3f1e49b205ac3de94f127c7fd91f2ae4d" Oct 03 18:18:44 crc kubenswrapper[4835]: E1003 18:18:44.366264 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c4f7bdaf5ccec3e9db60f01a7dd1ef3f1e49b205ac3de94f127c7fd91f2ae4d\": container with ID starting with 0c4f7bdaf5ccec3e9db60f01a7dd1ef3f1e49b205ac3de94f127c7fd91f2ae4d not found: ID does not exist" containerID="0c4f7bdaf5ccec3e9db60f01a7dd1ef3f1e49b205ac3de94f127c7fd91f2ae4d" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.366360 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c4f7bdaf5ccec3e9db60f01a7dd1ef3f1e49b205ac3de94f127c7fd91f2ae4d"} err="failed to get container status \"0c4f7bdaf5ccec3e9db60f01a7dd1ef3f1e49b205ac3de94f127c7fd91f2ae4d\": rpc error: code = NotFound desc = could not find container \"0c4f7bdaf5ccec3e9db60f01a7dd1ef3f1e49b205ac3de94f127c7fd91f2ae4d\": container with ID starting with 0c4f7bdaf5ccec3e9db60f01a7dd1ef3f1e49b205ac3de94f127c7fd91f2ae4d not found: ID does not exist" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.366399 4835 scope.go:117] "RemoveContainer" containerID="47366d00d368cbcbdf0ffa673294f1d624d4c08c5df95455638b43ed42dfcfd9" Oct 03 18:18:44 crc kubenswrapper[4835]: E1003 18:18:44.367394 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47366d00d368cbcbdf0ffa673294f1d624d4c08c5df95455638b43ed42dfcfd9\": container with ID starting with 47366d00d368cbcbdf0ffa673294f1d624d4c08c5df95455638b43ed42dfcfd9 not found: ID does not exist" containerID="47366d00d368cbcbdf0ffa673294f1d624d4c08c5df95455638b43ed42dfcfd9" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.367429 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47366d00d368cbcbdf0ffa673294f1d624d4c08c5df95455638b43ed42dfcfd9"} err="failed to get container status 
\"47366d00d368cbcbdf0ffa673294f1d624d4c08c5df95455638b43ed42dfcfd9\": rpc error: code = NotFound desc = could not find container \"47366d00d368cbcbdf0ffa673294f1d624d4c08c5df95455638b43ed42dfcfd9\": container with ID starting with 47366d00d368cbcbdf0ffa673294f1d624d4c08c5df95455638b43ed42dfcfd9 not found: ID does not exist" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.367447 4835 scope.go:117] "RemoveContainer" containerID="3078d612e85d68749cc69955c879ce82cc5ff9c4a5da27ceb32f92b719417091" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.370896 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pf9vb"] Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.373086 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pf9vb"] Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.379458 4835 scope.go:117] "RemoveContainer" containerID="3078d612e85d68749cc69955c879ce82cc5ff9c4a5da27ceb32f92b719417091" Oct 03 18:18:44 crc kubenswrapper[4835]: E1003 18:18:44.380598 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3078d612e85d68749cc69955c879ce82cc5ff9c4a5da27ceb32f92b719417091\": container with ID starting with 3078d612e85d68749cc69955c879ce82cc5ff9c4a5da27ceb32f92b719417091 not found: ID does not exist" containerID="3078d612e85d68749cc69955c879ce82cc5ff9c4a5da27ceb32f92b719417091" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.380631 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3078d612e85d68749cc69955c879ce82cc5ff9c4a5da27ceb32f92b719417091"} err="failed to get container status \"3078d612e85d68749cc69955c879ce82cc5ff9c4a5da27ceb32f92b719417091\": rpc error: code = NotFound desc = could not find container \"3078d612e85d68749cc69955c879ce82cc5ff9c4a5da27ceb32f92b719417091\": container with ID starting with 3078d612e85d68749cc69955c879ce82cc5ff9c4a5da27ceb32f92b719417091 not found: ID does not exist" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.882638 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6818d850-0c23-481b-b3f5-fbb31275d97f" path="/var/lib/kubelet/pods/6818d850-0c23-481b-b3f5-fbb31275d97f/volumes" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.883118 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b246bf3-c3d8-41d9-9ae1-660fdc057961" path="/var/lib/kubelet/pods/9b246bf3-c3d8-41d9-9ae1-660fdc057961/volumes" Oct 03 18:18:44 crc kubenswrapper[4835]: I1003 18:18:44.883683 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa6d5cf8-dfde-42f7-9507-48f41bf44b50" path="/var/lib/kubelet/pods/fa6d5cf8-dfde-42f7-9507-48f41bf44b50/volumes" Oct 03 18:18:45 crc kubenswrapper[4835]: I1003 18:18:45.051465 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tr45z" Oct 03 18:18:45 crc kubenswrapper[4835]: I1003 18:18:45.055265 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrf6l" Oct 03 18:18:45 crc kubenswrapper[4835]: I1003 18:18:45.056246 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2lz79" event={"ID":"340f6531-0442-4899-87ea-795466615b9b","Type":"ContainerStarted","Data":"ed0dd1f4615623599b1e91a5a3b7586871a2e5d34f249890ba42503f7c39a4ab"} Oct 03 18:18:45 crc kubenswrapper[4835]: I1003 18:18:45.056279 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2lz79" event={"ID":"340f6531-0442-4899-87ea-795466615b9b","Type":"ContainerStarted","Data":"ef266e2647c9575d763f030ad45641357131c1338d2c68bad3811e3e62439859"} Oct 03 18:18:45 crc kubenswrapper[4835]: I1003 18:18:45.056436 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2lz79" Oct 03 18:18:45 crc kubenswrapper[4835]: I1003 18:18:45.062426 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2lz79" Oct 03 18:18:45 crc kubenswrapper[4835]: I1003 18:18:45.068371 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tr45z"] Oct 03 18:18:45 crc kubenswrapper[4835]: I1003 18:18:45.071893 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tr45z"] Oct 03 18:18:45 crc kubenswrapper[4835]: I1003 18:18:45.082157 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrf6l"] Oct 03 18:18:45 crc kubenswrapper[4835]: I1003 18:18:45.086179 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrf6l"] Oct 03 18:18:45 crc kubenswrapper[4835]: I1003 18:18:45.099936 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2lz79" podStartSLOduration=2.099911246 podStartE2EDuration="2.099911246s" podCreationTimestamp="2025-10-03 18:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:18:45.097781478 +0000 UTC m=+266.813722360" watchObservedRunningTime="2025-10-03 18:18:45.099911246 +0000 UTC m=+266.815852118" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332105 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lfd5t"] Oct 03 18:18:46 crc kubenswrapper[4835]: E1003 18:18:46.332589 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6d5cf8-dfde-42f7-9507-48f41bf44b50" containerName="registry-server" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332602 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6d5cf8-dfde-42f7-9507-48f41bf44b50" containerName="registry-server" Oct 03 18:18:46 crc kubenswrapper[4835]: E1003 18:18:46.332610 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6d5cf8-dfde-42f7-9507-48f41bf44b50" containerName="extract-content" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332616 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6d5cf8-dfde-42f7-9507-48f41bf44b50" containerName="extract-content" Oct 03 18:18:46 crc kubenswrapper[4835]: E1003 18:18:46.332625 4835 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6fc82477-8141-4654-9153-b2a046309e8b" containerName="registry-server" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332632 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc82477-8141-4654-9153-b2a046309e8b" containerName="registry-server" Oct 03 18:18:46 crc kubenswrapper[4835]: E1003 18:18:46.332644 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b246bf3-c3d8-41d9-9ae1-660fdc057961" containerName="extract-utilities" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332650 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b246bf3-c3d8-41d9-9ae1-660fdc057961" containerName="extract-utilities" Oct 03 18:18:46 crc kubenswrapper[4835]: E1003 18:18:46.332658 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b246bf3-c3d8-41d9-9ae1-660fdc057961" containerName="extract-content" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332663 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b246bf3-c3d8-41d9-9ae1-660fdc057961" containerName="extract-content" Oct 03 18:18:46 crc kubenswrapper[4835]: E1003 18:18:46.332673 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6d7146-dd06-4086-8b38-2140c5deeff9" containerName="extract-utilities" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332679 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6d7146-dd06-4086-8b38-2140c5deeff9" containerName="extract-utilities" Oct 03 18:18:46 crc kubenswrapper[4835]: E1003 18:18:46.332687 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fc82477-8141-4654-9153-b2a046309e8b" containerName="extract-content" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332694 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc82477-8141-4654-9153-b2a046309e8b" containerName="extract-content" Oct 03 18:18:46 crc kubenswrapper[4835]: E1003 18:18:46.332703 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6d5cf8-dfde-42f7-9507-48f41bf44b50" containerName="extract-utilities" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332709 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6d5cf8-dfde-42f7-9507-48f41bf44b50" containerName="extract-utilities" Oct 03 18:18:46 crc kubenswrapper[4835]: E1003 18:18:46.332724 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6d7146-dd06-4086-8b38-2140c5deeff9" containerName="registry-server" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332731 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6d7146-dd06-4086-8b38-2140c5deeff9" containerName="registry-server" Oct 03 18:18:46 crc kubenswrapper[4835]: E1003 18:18:46.332739 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fc82477-8141-4654-9153-b2a046309e8b" containerName="extract-utilities" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332744 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc82477-8141-4654-9153-b2a046309e8b" containerName="extract-utilities" Oct 03 18:18:46 crc kubenswrapper[4835]: E1003 18:18:46.332751 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6818d850-0c23-481b-b3f5-fbb31275d97f" containerName="marketplace-operator" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332757 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6818d850-0c23-481b-b3f5-fbb31275d97f" containerName="marketplace-operator" Oct 03 18:18:46 crc kubenswrapper[4835]: E1003 18:18:46.332763 4835 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6d7146-dd06-4086-8b38-2140c5deeff9" containerName="extract-content" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332768 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6d7146-dd06-4086-8b38-2140c5deeff9" containerName="extract-content" Oct 03 18:18:46 crc kubenswrapper[4835]: E1003 18:18:46.332775 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b246bf3-c3d8-41d9-9ae1-660fdc057961" containerName="registry-server" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332780 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b246bf3-c3d8-41d9-9ae1-660fdc057961" containerName="registry-server" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332861 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed6d7146-dd06-4086-8b38-2140c5deeff9" containerName="registry-server" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332877 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="6818d850-0c23-481b-b3f5-fbb31275d97f" containerName="marketplace-operator" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332888 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6d5cf8-dfde-42f7-9507-48f41bf44b50" containerName="registry-server" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332894 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b246bf3-c3d8-41d9-9ae1-660fdc057961" containerName="registry-server" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.332901 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fc82477-8141-4654-9153-b2a046309e8b" containerName="registry-server" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.333548 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lfd5t" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.336032 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.343426 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lfd5t"] Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.437136 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c615419b-72c3-48d9-91b0-918dc3215104-catalog-content\") pod \"certified-operators-lfd5t\" (UID: \"c615419b-72c3-48d9-91b0-918dc3215104\") " pod="openshift-marketplace/certified-operators-lfd5t" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.437272 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c615419b-72c3-48d9-91b0-918dc3215104-utilities\") pod \"certified-operators-lfd5t\" (UID: \"c615419b-72c3-48d9-91b0-918dc3215104\") " pod="openshift-marketplace/certified-operators-lfd5t" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.437294 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk46m\" (UniqueName: \"kubernetes.io/projected/c615419b-72c3-48d9-91b0-918dc3215104-kube-api-access-wk46m\") pod \"certified-operators-lfd5t\" (UID: \"c615419b-72c3-48d9-91b0-918dc3215104\") " pod="openshift-marketplace/certified-operators-lfd5t" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.538535 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c615419b-72c3-48d9-91b0-918dc3215104-utilities\") pod \"certified-operators-lfd5t\" (UID: \"c615419b-72c3-48d9-91b0-918dc3215104\") " pod="openshift-marketplace/certified-operators-lfd5t" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.538582 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk46m\" (UniqueName: \"kubernetes.io/projected/c615419b-72c3-48d9-91b0-918dc3215104-kube-api-access-wk46m\") pod \"certified-operators-lfd5t\" (UID: \"c615419b-72c3-48d9-91b0-918dc3215104\") " pod="openshift-marketplace/certified-operators-lfd5t" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.538654 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c615419b-72c3-48d9-91b0-918dc3215104-catalog-content\") pod \"certified-operators-lfd5t\" (UID: \"c615419b-72c3-48d9-91b0-918dc3215104\") " pod="openshift-marketplace/certified-operators-lfd5t" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.539205 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c615419b-72c3-48d9-91b0-918dc3215104-utilities\") pod \"certified-operators-lfd5t\" (UID: \"c615419b-72c3-48d9-91b0-918dc3215104\") " pod="openshift-marketplace/certified-operators-lfd5t" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.539247 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c615419b-72c3-48d9-91b0-918dc3215104-catalog-content\") pod \"certified-operators-lfd5t\" (UID: 
\"c615419b-72c3-48d9-91b0-918dc3215104\") " pod="openshift-marketplace/certified-operators-lfd5t" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.556540 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk46m\" (UniqueName: \"kubernetes.io/projected/c615419b-72c3-48d9-91b0-918dc3215104-kube-api-access-wk46m\") pod \"certified-operators-lfd5t\" (UID: \"c615419b-72c3-48d9-91b0-918dc3215104\") " pod="openshift-marketplace/certified-operators-lfd5t" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.676717 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lfd5t" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.847159 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lfd5t"] Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.892795 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fc82477-8141-4654-9153-b2a046309e8b" path="/var/lib/kubelet/pods/6fc82477-8141-4654-9153-b2a046309e8b/volumes" Oct 03 18:18:46 crc kubenswrapper[4835]: I1003 18:18:46.893581 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed6d7146-dd06-4086-8b38-2140c5deeff9" path="/var/lib/kubelet/pods/ed6d7146-dd06-4086-8b38-2140c5deeff9/volumes" Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.068598 4835 generic.go:334] "Generic (PLEG): container finished" podID="c615419b-72c3-48d9-91b0-918dc3215104" containerID="c17fa9d573f7970a55b29b38f4588583fd0084623a6efde1644cb6de5a19443f" exitCode=0 Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.068724 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfd5t" event={"ID":"c615419b-72c3-48d9-91b0-918dc3215104","Type":"ContainerDied","Data":"c17fa9d573f7970a55b29b38f4588583fd0084623a6efde1644cb6de5a19443f"} Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.068776 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfd5t" event={"ID":"c615419b-72c3-48d9-91b0-918dc3215104","Type":"ContainerStarted","Data":"4d367ec8ac6768fdb1cdc1f33d3e6b2398e20b557b61507e8460f83cdd9e3d80"} Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.330291 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z6rtw"] Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.331403 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z6rtw" Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.337842 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.350726 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z6rtw"] Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.452164 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20470f97-2625-4ea5-87b4-7eadb2bdf759-utilities\") pod \"redhat-operators-z6rtw\" (UID: \"20470f97-2625-4ea5-87b4-7eadb2bdf759\") " pod="openshift-marketplace/redhat-operators-z6rtw" Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.452247 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dwzd\" (UniqueName: \"kubernetes.io/projected/20470f97-2625-4ea5-87b4-7eadb2bdf759-kube-api-access-6dwzd\") pod \"redhat-operators-z6rtw\" (UID: \"20470f97-2625-4ea5-87b4-7eadb2bdf759\") " pod="openshift-marketplace/redhat-operators-z6rtw" Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.452320 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20470f97-2625-4ea5-87b4-7eadb2bdf759-catalog-content\") pod \"redhat-operators-z6rtw\" (UID: \"20470f97-2625-4ea5-87b4-7eadb2bdf759\") " pod="openshift-marketplace/redhat-operators-z6rtw" Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.552926 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20470f97-2625-4ea5-87b4-7eadb2bdf759-catalog-content\") pod \"redhat-operators-z6rtw\" (UID: \"20470f97-2625-4ea5-87b4-7eadb2bdf759\") " pod="openshift-marketplace/redhat-operators-z6rtw" Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.552998 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20470f97-2625-4ea5-87b4-7eadb2bdf759-utilities\") pod \"redhat-operators-z6rtw\" (UID: \"20470f97-2625-4ea5-87b4-7eadb2bdf759\") " pod="openshift-marketplace/redhat-operators-z6rtw" Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.553040 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dwzd\" (UniqueName: \"kubernetes.io/projected/20470f97-2625-4ea5-87b4-7eadb2bdf759-kube-api-access-6dwzd\") pod \"redhat-operators-z6rtw\" (UID: \"20470f97-2625-4ea5-87b4-7eadb2bdf759\") " pod="openshift-marketplace/redhat-operators-z6rtw" Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.553523 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20470f97-2625-4ea5-87b4-7eadb2bdf759-utilities\") pod \"redhat-operators-z6rtw\" (UID: \"20470f97-2625-4ea5-87b4-7eadb2bdf759\") " pod="openshift-marketplace/redhat-operators-z6rtw" Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.553522 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20470f97-2625-4ea5-87b4-7eadb2bdf759-catalog-content\") pod \"redhat-operators-z6rtw\" (UID: \"20470f97-2625-4ea5-87b4-7eadb2bdf759\") " 
pod="openshift-marketplace/redhat-operators-z6rtw" Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.572014 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dwzd\" (UniqueName: \"kubernetes.io/projected/20470f97-2625-4ea5-87b4-7eadb2bdf759-kube-api-access-6dwzd\") pod \"redhat-operators-z6rtw\" (UID: \"20470f97-2625-4ea5-87b4-7eadb2bdf759\") " pod="openshift-marketplace/redhat-operators-z6rtw" Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.655202 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z6rtw" Oct 03 18:18:47 crc kubenswrapper[4835]: I1003 18:18:47.821038 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z6rtw"] Oct 03 18:18:47 crc kubenswrapper[4835]: W1003 18:18:47.826614 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20470f97_2625_4ea5_87b4_7eadb2bdf759.slice/crio-99eee0c07144d6ebc737194ff3ac1a152bd27213833f38b459aa35549b923441 WatchSource:0}: Error finding container 99eee0c07144d6ebc737194ff3ac1a152bd27213833f38b459aa35549b923441: Status 404 returned error can't find the container with id 99eee0c07144d6ebc737194ff3ac1a152bd27213833f38b459aa35549b923441 Oct 03 18:18:48 crc kubenswrapper[4835]: I1003 18:18:48.077232 4835 generic.go:334] "Generic (PLEG): container finished" podID="20470f97-2625-4ea5-87b4-7eadb2bdf759" containerID="538e1b59735aa2a40122dc8f562081c5ebfcac7150047ce5e670e69a229a5ebc" exitCode=0 Oct 03 18:18:48 crc kubenswrapper[4835]: I1003 18:18:48.077298 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6rtw" event={"ID":"20470f97-2625-4ea5-87b4-7eadb2bdf759","Type":"ContainerDied","Data":"538e1b59735aa2a40122dc8f562081c5ebfcac7150047ce5e670e69a229a5ebc"} Oct 03 18:18:48 crc kubenswrapper[4835]: I1003 18:18:48.077323 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6rtw" event={"ID":"20470f97-2625-4ea5-87b4-7eadb2bdf759","Type":"ContainerStarted","Data":"99eee0c07144d6ebc737194ff3ac1a152bd27213833f38b459aa35549b923441"} Oct 03 18:18:48 crc kubenswrapper[4835]: I1003 18:18:48.733495 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8p8gm"] Oct 03 18:18:48 crc kubenswrapper[4835]: I1003 18:18:48.734875 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8p8gm" Oct 03 18:18:48 crc kubenswrapper[4835]: I1003 18:18:48.737326 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 18:18:48 crc kubenswrapper[4835]: I1003 18:18:48.750845 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8p8gm"] Oct 03 18:18:48 crc kubenswrapper[4835]: I1003 18:18:48.869849 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2af817b8-e0e4-43b7-9000-566f1ef27a80-catalog-content\") pod \"community-operators-8p8gm\" (UID: \"2af817b8-e0e4-43b7-9000-566f1ef27a80\") " pod="openshift-marketplace/community-operators-8p8gm" Oct 03 18:18:48 crc kubenswrapper[4835]: I1003 18:18:48.869943 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk7nv\" (UniqueName: \"kubernetes.io/projected/2af817b8-e0e4-43b7-9000-566f1ef27a80-kube-api-access-hk7nv\") pod \"community-operators-8p8gm\" (UID: \"2af817b8-e0e4-43b7-9000-566f1ef27a80\") " pod="openshift-marketplace/community-operators-8p8gm" Oct 03 18:18:48 crc kubenswrapper[4835]: I1003 18:18:48.869988 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af817b8-e0e4-43b7-9000-566f1ef27a80-utilities\") pod \"community-operators-8p8gm\" (UID: \"2af817b8-e0e4-43b7-9000-566f1ef27a80\") " pod="openshift-marketplace/community-operators-8p8gm" Oct 03 18:18:48 crc kubenswrapper[4835]: I1003 18:18:48.970619 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2af817b8-e0e4-43b7-9000-566f1ef27a80-catalog-content\") pod \"community-operators-8p8gm\" (UID: \"2af817b8-e0e4-43b7-9000-566f1ef27a80\") " pod="openshift-marketplace/community-operators-8p8gm" Oct 03 18:18:48 crc kubenswrapper[4835]: I1003 18:18:48.970703 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk7nv\" (UniqueName: \"kubernetes.io/projected/2af817b8-e0e4-43b7-9000-566f1ef27a80-kube-api-access-hk7nv\") pod \"community-operators-8p8gm\" (UID: \"2af817b8-e0e4-43b7-9000-566f1ef27a80\") " pod="openshift-marketplace/community-operators-8p8gm" Oct 03 18:18:48 crc kubenswrapper[4835]: I1003 18:18:48.970744 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af817b8-e0e4-43b7-9000-566f1ef27a80-utilities\") pod \"community-operators-8p8gm\" (UID: \"2af817b8-e0e4-43b7-9000-566f1ef27a80\") " pod="openshift-marketplace/community-operators-8p8gm" Oct 03 18:18:48 crc kubenswrapper[4835]: I1003 18:18:48.971206 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2af817b8-e0e4-43b7-9000-566f1ef27a80-catalog-content\") pod \"community-operators-8p8gm\" (UID: \"2af817b8-e0e4-43b7-9000-566f1ef27a80\") " pod="openshift-marketplace/community-operators-8p8gm" Oct 03 18:18:48 crc kubenswrapper[4835]: I1003 18:18:48.971245 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2af817b8-e0e4-43b7-9000-566f1ef27a80-utilities\") pod \"community-operators-8p8gm\" (UID: 
\"2af817b8-e0e4-43b7-9000-566f1ef27a80\") " pod="openshift-marketplace/community-operators-8p8gm" Oct 03 18:18:48 crc kubenswrapper[4835]: I1003 18:18:48.988970 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk7nv\" (UniqueName: \"kubernetes.io/projected/2af817b8-e0e4-43b7-9000-566f1ef27a80-kube-api-access-hk7nv\") pod \"community-operators-8p8gm\" (UID: \"2af817b8-e0e4-43b7-9000-566f1ef27a80\") " pod="openshift-marketplace/community-operators-8p8gm" Oct 03 18:18:49 crc kubenswrapper[4835]: I1003 18:18:49.052342 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p8gm" Oct 03 18:18:49 crc kubenswrapper[4835]: I1003 18:18:49.093796 4835 generic.go:334] "Generic (PLEG): container finished" podID="c615419b-72c3-48d9-91b0-918dc3215104" containerID="6c412631493c4add7ac741de52f99bf62eb6d3edd04a0228cc5727b4d418dcac" exitCode=0 Oct 03 18:18:49 crc kubenswrapper[4835]: I1003 18:18:49.093905 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfd5t" event={"ID":"c615419b-72c3-48d9-91b0-918dc3215104","Type":"ContainerDied","Data":"6c412631493c4add7ac741de52f99bf62eb6d3edd04a0228cc5727b4d418dcac"} Oct 03 18:18:49 crc kubenswrapper[4835]: I1003 18:18:49.241843 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8p8gm"] Oct 03 18:18:49 crc kubenswrapper[4835]: I1003 18:18:49.734524 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sbbwc"] Oct 03 18:18:49 crc kubenswrapper[4835]: I1003 18:18:49.736997 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbbwc" Oct 03 18:18:49 crc kubenswrapper[4835]: I1003 18:18:49.739864 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 03 18:18:49 crc kubenswrapper[4835]: I1003 18:18:49.744725 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbbwc"] Oct 03 18:18:49 crc kubenswrapper[4835]: I1003 18:18:49.881873 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d055144-3a57-4244-ba27-468b77001e54-utilities\") pod \"redhat-marketplace-sbbwc\" (UID: \"5d055144-3a57-4244-ba27-468b77001e54\") " pod="openshift-marketplace/redhat-marketplace-sbbwc" Oct 03 18:18:49 crc kubenswrapper[4835]: I1003 18:18:49.882253 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d055144-3a57-4244-ba27-468b77001e54-catalog-content\") pod \"redhat-marketplace-sbbwc\" (UID: \"5d055144-3a57-4244-ba27-468b77001e54\") " pod="openshift-marketplace/redhat-marketplace-sbbwc" Oct 03 18:18:49 crc kubenswrapper[4835]: I1003 18:18:49.882317 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mppqx\" (UniqueName: \"kubernetes.io/projected/5d055144-3a57-4244-ba27-468b77001e54-kube-api-access-mppqx\") pod \"redhat-marketplace-sbbwc\" (UID: \"5d055144-3a57-4244-ba27-468b77001e54\") " pod="openshift-marketplace/redhat-marketplace-sbbwc" Oct 03 18:18:49 crc kubenswrapper[4835]: I1003 18:18:49.984005 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d055144-3a57-4244-ba27-468b77001e54-utilities\") pod \"redhat-marketplace-sbbwc\" (UID: \"5d055144-3a57-4244-ba27-468b77001e54\") " pod="openshift-marketplace/redhat-marketplace-sbbwc" Oct 03 18:18:49 crc kubenswrapper[4835]: I1003 18:18:49.984173 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d055144-3a57-4244-ba27-468b77001e54-catalog-content\") pod \"redhat-marketplace-sbbwc\" (UID: \"5d055144-3a57-4244-ba27-468b77001e54\") " pod="openshift-marketplace/redhat-marketplace-sbbwc" Oct 03 18:18:49 crc kubenswrapper[4835]: I1003 18:18:49.984235 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mppqx\" (UniqueName: \"kubernetes.io/projected/5d055144-3a57-4244-ba27-468b77001e54-kube-api-access-mppqx\") pod \"redhat-marketplace-sbbwc\" (UID: \"5d055144-3a57-4244-ba27-468b77001e54\") " pod="openshift-marketplace/redhat-marketplace-sbbwc" Oct 03 18:18:49 crc kubenswrapper[4835]: I1003 18:18:49.984874 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d055144-3a57-4244-ba27-468b77001e54-utilities\") pod \"redhat-marketplace-sbbwc\" (UID: \"5d055144-3a57-4244-ba27-468b77001e54\") " pod="openshift-marketplace/redhat-marketplace-sbbwc" Oct 03 18:18:49 crc kubenswrapper[4835]: I1003 18:18:49.984941 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d055144-3a57-4244-ba27-468b77001e54-catalog-content\") pod \"redhat-marketplace-sbbwc\" (UID: \"5d055144-3a57-4244-ba27-468b77001e54\") " pod="openshift-marketplace/redhat-marketplace-sbbwc" Oct 03 18:18:50 crc kubenswrapper[4835]: I1003 18:18:50.010268 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mppqx\" (UniqueName: \"kubernetes.io/projected/5d055144-3a57-4244-ba27-468b77001e54-kube-api-access-mppqx\") pod \"redhat-marketplace-sbbwc\" (UID: \"5d055144-3a57-4244-ba27-468b77001e54\") " pod="openshift-marketplace/redhat-marketplace-sbbwc" Oct 03 18:18:50 crc kubenswrapper[4835]: I1003 18:18:50.063302 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbbwc" Oct 03 18:18:50 crc kubenswrapper[4835]: I1003 18:18:50.102795 4835 generic.go:334] "Generic (PLEG): container finished" podID="20470f97-2625-4ea5-87b4-7eadb2bdf759" containerID="eee5e147539362c5d58101ebedea9fab21aeceb5ffac0f07f4ee3332999f6b0e" exitCode=0 Oct 03 18:18:50 crc kubenswrapper[4835]: I1003 18:18:50.102861 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6rtw" event={"ID":"20470f97-2625-4ea5-87b4-7eadb2bdf759","Type":"ContainerDied","Data":"eee5e147539362c5d58101ebedea9fab21aeceb5ffac0f07f4ee3332999f6b0e"} Oct 03 18:18:50 crc kubenswrapper[4835]: I1003 18:18:50.105010 4835 generic.go:334] "Generic (PLEG): container finished" podID="2af817b8-e0e4-43b7-9000-566f1ef27a80" containerID="9d6fb419f134d9cc88f5c74806210a92b54db92b9ca3bccabfd70ae3a364e544" exitCode=0 Oct 03 18:18:50 crc kubenswrapper[4835]: I1003 18:18:50.105079 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p8gm" event={"ID":"2af817b8-e0e4-43b7-9000-566f1ef27a80","Type":"ContainerDied","Data":"9d6fb419f134d9cc88f5c74806210a92b54db92b9ca3bccabfd70ae3a364e544"} Oct 03 18:18:50 crc kubenswrapper[4835]: I1003 18:18:50.105123 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p8gm" event={"ID":"2af817b8-e0e4-43b7-9000-566f1ef27a80","Type":"ContainerStarted","Data":"5ab09de12d576b799e14a59711f283d7a3dcf2ecaa3cbdbbd9d0574d28b74417"} Oct 03 18:18:50 crc kubenswrapper[4835]: I1003 18:18:50.121335 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lfd5t" event={"ID":"c615419b-72c3-48d9-91b0-918dc3215104","Type":"ContainerStarted","Data":"6227370f0455d4afd33aec9fa6973f5f0084ec508183e5bf4342cb02ba0d66fc"} Oct 03 18:18:50 crc kubenswrapper[4835]: I1003 18:18:50.178272 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lfd5t" podStartSLOduration=1.613324586 podStartE2EDuration="4.178246361s" podCreationTimestamp="2025-10-03 18:18:46 +0000 UTC" firstStartedPulling="2025-10-03 18:18:47.070270867 +0000 UTC m=+268.786211739" lastFinishedPulling="2025-10-03 18:18:49.635192642 +0000 UTC m=+271.351133514" observedRunningTime="2025-10-03 18:18:50.154485178 +0000 UTC m=+271.870426060" watchObservedRunningTime="2025-10-03 18:18:50.178246361 +0000 UTC m=+271.894187233" Oct 03 18:18:50 crc kubenswrapper[4835]: I1003 18:18:50.282424 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbbwc"] Oct 03 18:18:51 crc kubenswrapper[4835]: I1003 18:18:51.127781 4835 generic.go:334] "Generic (PLEG): container finished" podID="5d055144-3a57-4244-ba27-468b77001e54" containerID="27cddff65abf18653747bc104a1b5c3ff865285efef8442c9d3c78eca2f57d91" exitCode=0 Oct 03 18:18:51 crc kubenswrapper[4835]: I1003 18:18:51.127819 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbbwc" event={"ID":"5d055144-3a57-4244-ba27-468b77001e54","Type":"ContainerDied","Data":"27cddff65abf18653747bc104a1b5c3ff865285efef8442c9d3c78eca2f57d91"} Oct 03 18:18:51 crc kubenswrapper[4835]: I1003 18:18:51.128326 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbbwc" 
event={"ID":"5d055144-3a57-4244-ba27-468b77001e54","Type":"ContainerStarted","Data":"613c62bce0149e702bf061e4202285490b514de0454181e772e27de8dd45d8ef"} Oct 03 18:18:52 crc kubenswrapper[4835]: I1003 18:18:52.135430 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p8gm" event={"ID":"2af817b8-e0e4-43b7-9000-566f1ef27a80","Type":"ContainerStarted","Data":"e1115b12c598b58e77a068465f1b7e3270ba72b3e625e703f56d6471223df05e"} Oct 03 18:18:52 crc kubenswrapper[4835]: I1003 18:18:52.138876 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6rtw" event={"ID":"20470f97-2625-4ea5-87b4-7eadb2bdf759","Type":"ContainerStarted","Data":"07270445c707a50b793dd6b47e7c447916a2f383bcf99d87a94d0d8e58da0ef6"} Oct 03 18:18:53 crc kubenswrapper[4835]: I1003 18:18:53.145859 4835 generic.go:334] "Generic (PLEG): container finished" podID="2af817b8-e0e4-43b7-9000-566f1ef27a80" containerID="e1115b12c598b58e77a068465f1b7e3270ba72b3e625e703f56d6471223df05e" exitCode=0 Oct 03 18:18:53 crc kubenswrapper[4835]: I1003 18:18:53.145985 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p8gm" event={"ID":"2af817b8-e0e4-43b7-9000-566f1ef27a80","Type":"ContainerDied","Data":"e1115b12c598b58e77a068465f1b7e3270ba72b3e625e703f56d6471223df05e"} Oct 03 18:18:53 crc kubenswrapper[4835]: I1003 18:18:53.148604 4835 generic.go:334] "Generic (PLEG): container finished" podID="5d055144-3a57-4244-ba27-468b77001e54" containerID="ef209a1336467252d3149f58215a3aea4d53a79a34f450838f629c32c77a2c92" exitCode=0 Oct 03 18:18:53 crc kubenswrapper[4835]: I1003 18:18:53.149136 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbbwc" event={"ID":"5d055144-3a57-4244-ba27-468b77001e54","Type":"ContainerDied","Data":"ef209a1336467252d3149f58215a3aea4d53a79a34f450838f629c32c77a2c92"} Oct 03 18:18:53 crc kubenswrapper[4835]: I1003 18:18:53.187225 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z6rtw" podStartSLOduration=3.639583907 podStartE2EDuration="6.187204144s" podCreationTimestamp="2025-10-03 18:18:47 +0000 UTC" firstStartedPulling="2025-10-03 18:18:48.078729053 +0000 UTC m=+269.794669925" lastFinishedPulling="2025-10-03 18:18:50.62634929 +0000 UTC m=+272.342290162" observedRunningTime="2025-10-03 18:18:53.185807046 +0000 UTC m=+274.901747918" watchObservedRunningTime="2025-10-03 18:18:53.187204144 +0000 UTC m=+274.903145136" Oct 03 18:18:54 crc kubenswrapper[4835]: I1003 18:18:54.156991 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbbwc" event={"ID":"5d055144-3a57-4244-ba27-468b77001e54","Type":"ContainerStarted","Data":"e6b31a387ef85172d9a3328242b1c012cd9ed5cd3fbe6be025e53309b071310a"} Oct 03 18:18:54 crc kubenswrapper[4835]: I1003 18:18:54.183172 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sbbwc" podStartSLOduration=2.408245794 podStartE2EDuration="5.183153922s" podCreationTimestamp="2025-10-03 18:18:49 +0000 UTC" firstStartedPulling="2025-10-03 18:18:51.130729732 +0000 UTC m=+272.846670604" lastFinishedPulling="2025-10-03 18:18:53.90563786 +0000 UTC m=+275.621578732" observedRunningTime="2025-10-03 18:18:54.180331255 +0000 UTC m=+275.896272127" watchObservedRunningTime="2025-10-03 18:18:54.183153922 +0000 UTC m=+275.899094794" Oct 03 18:18:55 crc 
kubenswrapper[4835]: I1003 18:18:55.164567 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p8gm" event={"ID":"2af817b8-e0e4-43b7-9000-566f1ef27a80","Type":"ContainerStarted","Data":"7c051ecf798b3480e8c86139a1094c1f8ece7c592bcbfe3bcb8f0eff0dc89a57"} Oct 03 18:18:55 crc kubenswrapper[4835]: I1003 18:18:55.192600 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8p8gm" podStartSLOduration=3.292774029 podStartE2EDuration="7.192577634s" podCreationTimestamp="2025-10-03 18:18:48 +0000 UTC" firstStartedPulling="2025-10-03 18:18:50.11095694 +0000 UTC m=+271.826897812" lastFinishedPulling="2025-10-03 18:18:54.010760545 +0000 UTC m=+275.726701417" observedRunningTime="2025-10-03 18:18:55.188977546 +0000 UTC m=+276.904918418" watchObservedRunningTime="2025-10-03 18:18:55.192577634 +0000 UTC m=+276.908518506" Oct 03 18:18:56 crc kubenswrapper[4835]: I1003 18:18:56.677568 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lfd5t" Oct 03 18:18:56 crc kubenswrapper[4835]: I1003 18:18:56.677818 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lfd5t" Oct 03 18:18:56 crc kubenswrapper[4835]: I1003 18:18:56.716229 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lfd5t" Oct 03 18:18:57 crc kubenswrapper[4835]: I1003 18:18:57.209315 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lfd5t" Oct 03 18:18:57 crc kubenswrapper[4835]: I1003 18:18:57.656039 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z6rtw" Oct 03 18:18:57 crc kubenswrapper[4835]: I1003 18:18:57.656118 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z6rtw" Oct 03 18:18:57 crc kubenswrapper[4835]: I1003 18:18:57.700874 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z6rtw" Oct 03 18:18:58 crc kubenswrapper[4835]: I1003 18:18:58.214162 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z6rtw" Oct 03 18:18:59 crc kubenswrapper[4835]: I1003 18:18:59.052500 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8p8gm" Oct 03 18:18:59 crc kubenswrapper[4835]: I1003 18:18:59.053888 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8p8gm" Oct 03 18:18:59 crc kubenswrapper[4835]: I1003 18:18:59.091437 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8p8gm" Oct 03 18:18:59 crc kubenswrapper[4835]: I1003 18:18:59.219159 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8p8gm" Oct 03 18:19:00 crc kubenswrapper[4835]: I1003 18:19:00.063532 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sbbwc" Oct 03 18:19:00 crc kubenswrapper[4835]: I1003 18:19:00.063592 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-sbbwc" Oct 03 18:19:00 crc kubenswrapper[4835]: I1003 18:19:00.108216 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sbbwc" Oct 03 18:19:00 crc kubenswrapper[4835]: I1003 18:19:00.249513 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sbbwc" Oct 03 18:20:05 crc kubenswrapper[4835]: I1003 18:20:05.362753 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:20:05 crc kubenswrapper[4835]: I1003 18:20:05.363292 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:20:35 crc kubenswrapper[4835]: I1003 18:20:35.358668 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:20:35 crc kubenswrapper[4835]: I1003 18:20:35.359276 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:20:52 crc kubenswrapper[4835]: I1003 18:20:52.801414 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mlf8f"] Oct 03 18:20:52 crc kubenswrapper[4835]: I1003 18:20:52.802732 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:52 crc kubenswrapper[4835]: I1003 18:20:52.827388 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mlf8f"] Oct 03 18:20:52 crc kubenswrapper[4835]: I1003 18:20:52.924299 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg5pw\" (UniqueName: \"kubernetes.io/projected/46570336-0ab0-42cb-b578-69966b41e5eb-kube-api-access-qg5pw\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:52 crc kubenswrapper[4835]: I1003 18:20:52.924344 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/46570336-0ab0-42cb-b578-69966b41e5eb-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:52 crc kubenswrapper[4835]: I1003 18:20:52.924382 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/46570336-0ab0-42cb-b578-69966b41e5eb-registry-tls\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:52 crc kubenswrapper[4835]: I1003 18:20:52.924429 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/46570336-0ab0-42cb-b578-69966b41e5eb-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:52 crc kubenswrapper[4835]: I1003 18:20:52.924477 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46570336-0ab0-42cb-b578-69966b41e5eb-trusted-ca\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:52 crc kubenswrapper[4835]: I1003 18:20:52.924511 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/46570336-0ab0-42cb-b578-69966b41e5eb-registry-certificates\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:52 crc kubenswrapper[4835]: I1003 18:20:52.924534 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46570336-0ab0-42cb-b578-69966b41e5eb-bound-sa-token\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:52 crc kubenswrapper[4835]: I1003 18:20:52.924738 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:52 crc kubenswrapper[4835]: I1003 18:20:52.945780 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.026343 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg5pw\" (UniqueName: \"kubernetes.io/projected/46570336-0ab0-42cb-b578-69966b41e5eb-kube-api-access-qg5pw\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.026409 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/46570336-0ab0-42cb-b578-69966b41e5eb-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.026444 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/46570336-0ab0-42cb-b578-69966b41e5eb-registry-tls\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.026466 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/46570336-0ab0-42cb-b578-69966b41e5eb-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.026494 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46570336-0ab0-42cb-b578-69966b41e5eb-trusted-ca\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.026520 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/46570336-0ab0-42cb-b578-69966b41e5eb-registry-certificates\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.026553 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46570336-0ab0-42cb-b578-69966b41e5eb-bound-sa-token\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.027924 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46570336-0ab0-42cb-b578-69966b41e5eb-trusted-ca\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.027938 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/46570336-0ab0-42cb-b578-69966b41e5eb-registry-certificates\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.028253 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/46570336-0ab0-42cb-b578-69966b41e5eb-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.032299 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/46570336-0ab0-42cb-b578-69966b41e5eb-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.032944 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/46570336-0ab0-42cb-b578-69966b41e5eb-registry-tls\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.041190 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46570336-0ab0-42cb-b578-69966b41e5eb-bound-sa-token\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.042881 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg5pw\" (UniqueName: \"kubernetes.io/projected/46570336-0ab0-42cb-b578-69966b41e5eb-kube-api-access-qg5pw\") pod \"image-registry-66df7c8f76-mlf8f\" (UID: \"46570336-0ab0-42cb-b578-69966b41e5eb\") " pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.118167 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.285866 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mlf8f"] Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.733650 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" event={"ID":"46570336-0ab0-42cb-b578-69966b41e5eb","Type":"ContainerStarted","Data":"432f098df9552524b87d57c2c1b6b2229bffce394a9daa3aba2568fc37c2be3b"} Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.734005 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.734019 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" event={"ID":"46570336-0ab0-42cb-b578-69966b41e5eb","Type":"ContainerStarted","Data":"9d537023867bd0ef63d5b980c10243cfe19f4b16adcf36be751bd66cd35b193d"} Oct 03 18:20:53 crc kubenswrapper[4835]: I1003 18:20:53.756607 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" podStartSLOduration=1.756584634 podStartE2EDuration="1.756584634s" podCreationTimestamp="2025-10-03 18:20:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:20:53.756009639 +0000 UTC m=+395.471950521" watchObservedRunningTime="2025-10-03 18:20:53.756584634 +0000 UTC m=+395.472525506" Oct 03 18:21:05 crc kubenswrapper[4835]: I1003 18:21:05.358509 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:21:05 crc kubenswrapper[4835]: I1003 18:21:05.358941 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:21:05 crc kubenswrapper[4835]: I1003 18:21:05.358980 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:21:05 crc kubenswrapper[4835]: I1003 18:21:05.359599 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"33b22db8eab068f3d27b86b574d9d679f3087e01aa6ee7e5483fdafa16b4a8b9"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 18:21:05 crc kubenswrapper[4835]: I1003 18:21:05.359657 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://33b22db8eab068f3d27b86b574d9d679f3087e01aa6ee7e5483fdafa16b4a8b9" gracePeriod=600 Oct 03 18:21:05 crc kubenswrapper[4835]: I1003 
18:21:05.788931 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="33b22db8eab068f3d27b86b574d9d679f3087e01aa6ee7e5483fdafa16b4a8b9" exitCode=0 Oct 03 18:21:05 crc kubenswrapper[4835]: I1003 18:21:05.788984 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"33b22db8eab068f3d27b86b574d9d679f3087e01aa6ee7e5483fdafa16b4a8b9"} Oct 03 18:21:05 crc kubenswrapper[4835]: I1003 18:21:05.789317 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"f9f2478d03690f18cde85cd947722003ae9ebb4f9f69a11ddfa8dc6c6d386ff2"} Oct 03 18:21:05 crc kubenswrapper[4835]: I1003 18:21:05.789357 4835 scope.go:117] "RemoveContainer" containerID="82d7a841800fa2c5ae650b0fd9c6820ef4200d5ba1a566701c5f3db336f5ddaf" Oct 03 18:21:13 crc kubenswrapper[4835]: I1003 18:21:13.124675 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mlf8f" Oct 03 18:21:13 crc kubenswrapper[4835]: I1003 18:21:13.170605 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqldm"] Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.222951 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" podUID="13a0a3b4-f158-42a9-bfbb-2776aa6efe75" containerName="registry" containerID="cri-o://88ffba0983de35eb7dac69554c11c6d6eb33b5be9fb755e071662a50665c0e99" gracePeriod=30 Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.516269 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.622014 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-bound-sa-token\") pod \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.622049 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-registry-tls\") pod \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.622097 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-ca-trust-extracted\") pod \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.622136 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-registry-certificates\") pod \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.622166 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-installation-pull-secrets\") pod \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.622293 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.623263 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "13a0a3b4-f158-42a9-bfbb-2776aa6efe75" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.623571 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-trusted-ca\") pod \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.623602 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kttxv\" (UniqueName: \"kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-kube-api-access-kttxv\") pod \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\" (UID: \"13a0a3b4-f158-42a9-bfbb-2776aa6efe75\") " Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.624136 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "13a0a3b4-f158-42a9-bfbb-2776aa6efe75" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.624503 4835 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.624523 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.628925 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "13a0a3b4-f158-42a9-bfbb-2776aa6efe75" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.629099 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-kube-api-access-kttxv" (OuterVolumeSpecName: "kube-api-access-kttxv") pod "13a0a3b4-f158-42a9-bfbb-2776aa6efe75" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75"). InnerVolumeSpecName "kube-api-access-kttxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.629414 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "13a0a3b4-f158-42a9-bfbb-2776aa6efe75" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.629844 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "13a0a3b4-f158-42a9-bfbb-2776aa6efe75" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.642398 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "13a0a3b4-f158-42a9-bfbb-2776aa6efe75" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.648155 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "13a0a3b4-f158-42a9-bfbb-2776aa6efe75" (UID: "13a0a3b4-f158-42a9-bfbb-2776aa6efe75"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.725671 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kttxv\" (UniqueName: \"kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-kube-api-access-kttxv\") on node \"crc\" DevicePath \"\"" Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.725703 4835 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.725715 4835 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.725723 4835 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.725733 4835 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/13a0a3b4-f158-42a9-bfbb-2776aa6efe75-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 18:21:38 crc kubenswrapper[4835]: I1003 18:21:38.999895 4835 generic.go:334] "Generic (PLEG): container finished" podID="13a0a3b4-f158-42a9-bfbb-2776aa6efe75" containerID="88ffba0983de35eb7dac69554c11c6d6eb33b5be9fb755e071662a50665c0e99" exitCode=0 Oct 03 18:21:39 crc kubenswrapper[4835]: I1003 18:21:38.999942 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" event={"ID":"13a0a3b4-f158-42a9-bfbb-2776aa6efe75","Type":"ContainerDied","Data":"88ffba0983de35eb7dac69554c11c6d6eb33b5be9fb755e071662a50665c0e99"} Oct 03 18:21:39 crc kubenswrapper[4835]: I1003 18:21:38.999998 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" Oct 03 18:21:39 crc kubenswrapper[4835]: I1003 18:21:39.000267 4835 scope.go:117] "RemoveContainer" containerID="88ffba0983de35eb7dac69554c11c6d6eb33b5be9fb755e071662a50665c0e99" Oct 03 18:21:39 crc kubenswrapper[4835]: I1003 18:21:39.000176 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vqldm" event={"ID":"13a0a3b4-f158-42a9-bfbb-2776aa6efe75","Type":"ContainerDied","Data":"ee441ca13503c8c8d641bfcc8f526c2995f788925812a967c4e87f0ca06fc8bc"} Oct 03 18:21:39 crc kubenswrapper[4835]: I1003 18:21:39.018667 4835 scope.go:117] "RemoveContainer" containerID="88ffba0983de35eb7dac69554c11c6d6eb33b5be9fb755e071662a50665c0e99" Oct 03 18:21:39 crc kubenswrapper[4835]: E1003 18:21:39.019076 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88ffba0983de35eb7dac69554c11c6d6eb33b5be9fb755e071662a50665c0e99\": container with ID starting with 88ffba0983de35eb7dac69554c11c6d6eb33b5be9fb755e071662a50665c0e99 not found: ID does not exist" containerID="88ffba0983de35eb7dac69554c11c6d6eb33b5be9fb755e071662a50665c0e99" Oct 03 18:21:39 crc kubenswrapper[4835]: I1003 18:21:39.019109 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88ffba0983de35eb7dac69554c11c6d6eb33b5be9fb755e071662a50665c0e99"} err="failed to get container status \"88ffba0983de35eb7dac69554c11c6d6eb33b5be9fb755e071662a50665c0e99\": rpc error: code = NotFound desc = could not find container \"88ffba0983de35eb7dac69554c11c6d6eb33b5be9fb755e071662a50665c0e99\": container with ID starting with 88ffba0983de35eb7dac69554c11c6d6eb33b5be9fb755e071662a50665c0e99 not found: ID does not exist" Oct 03 18:21:39 crc kubenswrapper[4835]: I1003 18:21:39.021444 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqldm"] Oct 03 18:21:39 crc kubenswrapper[4835]: I1003 18:21:39.029225 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqldm"] Oct 03 18:21:40 crc kubenswrapper[4835]: I1003 18:21:40.883964 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a0a3b4-f158-42a9-bfbb-2776aa6efe75" path="/var/lib/kubelet/pods/13a0a3b4-f158-42a9-bfbb-2776aa6efe75/volumes" Oct 03 18:23:05 crc kubenswrapper[4835]: I1003 18:23:05.358879 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:23:05 crc kubenswrapper[4835]: I1003 18:23:05.359329 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:23:19 crc kubenswrapper[4835]: I1003 18:23:18.999671 4835 scope.go:117] "RemoveContainer" containerID="a5114f0d0fa2cc8cc37df6098e64b8553d7a7b3e4891cc210d3396a3d0050377" Oct 03 18:23:35 crc kubenswrapper[4835]: I1003 18:23:35.358197 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:23:35 crc kubenswrapper[4835]: I1003 18:23:35.358701 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:24:05 crc kubenswrapper[4835]: I1003 18:24:05.358499 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:24:05 crc kubenswrapper[4835]: I1003 18:24:05.358987 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:24:05 crc kubenswrapper[4835]: I1003 18:24:05.359033 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:24:05 crc kubenswrapper[4835]: I1003 18:24:05.359543 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9f2478d03690f18cde85cd947722003ae9ebb4f9f69a11ddfa8dc6c6d386ff2"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 18:24:05 crc kubenswrapper[4835]: I1003 18:24:05.359592 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://f9f2478d03690f18cde85cd947722003ae9ebb4f9f69a11ddfa8dc6c6d386ff2" gracePeriod=600 Oct 03 18:24:05 crc kubenswrapper[4835]: I1003 18:24:05.709061 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="f9f2478d03690f18cde85cd947722003ae9ebb4f9f69a11ddfa8dc6c6d386ff2" exitCode=0 Oct 03 18:24:05 crc kubenswrapper[4835]: I1003 18:24:05.709102 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"f9f2478d03690f18cde85cd947722003ae9ebb4f9f69a11ddfa8dc6c6d386ff2"} Oct 03 18:24:05 crc kubenswrapper[4835]: I1003 18:24:05.709409 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"6cbefddbf8736040316432cea35633f5b9fb0a39e77bcc8a41c22e4802ea88fe"} Oct 03 18:24:05 crc kubenswrapper[4835]: I1003 18:24:05.709431 4835 scope.go:117] "RemoveContainer" containerID="33b22db8eab068f3d27b86b574d9d679f3087e01aa6ee7e5483fdafa16b4a8b9" Oct 03 18:24:19 crc kubenswrapper[4835]: I1003 18:24:19.022059 4835 scope.go:117] 
"RemoveContainer" containerID="83d461505ea7acd3e1ca60e9b418e3f82905c45bfcc319eb86bc49700330d549" Oct 03 18:24:19 crc kubenswrapper[4835]: I1003 18:24:19.041367 4835 scope.go:117] "RemoveContainer" containerID="5c78f2004076a8578f224598072f4df4f279aff9d3185832bafcb9466ef75302" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.202789 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wsd6d"] Oct 03 18:24:24 crc kubenswrapper[4835]: E1003 18:24:24.203875 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a0a3b4-f158-42a9-bfbb-2776aa6efe75" containerName="registry" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.203891 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a0a3b4-f158-42a9-bfbb-2776aa6efe75" containerName="registry" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.203977 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a0a3b4-f158-42a9-bfbb-2776aa6efe75" containerName="registry" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.204526 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-wsd6d" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.207742 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.208253 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.208428 4835 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-wrwzg" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.209431 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xlhlh"] Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.210662 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-xlhlh" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.213772 4835 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-7kxff" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.221048 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wsd6d"] Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.228569 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xlhlh"] Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.232679 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-hxx5j"] Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.233695 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-hxx5j" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.236252 4835 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-2mkl8" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.244998 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-hxx5j"] Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.370114 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrdbz\" (UniqueName: \"kubernetes.io/projected/ca95ceab-81bd-4963-80c1-321c5b1c63ef-kube-api-access-jrdbz\") pod \"cert-manager-5b446d88c5-xlhlh\" (UID: \"ca95ceab-81bd-4963-80c1-321c5b1c63ef\") " pod="cert-manager/cert-manager-5b446d88c5-xlhlh" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.370180 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgfmp\" (UniqueName: \"kubernetes.io/projected/bb91054f-ac67-458b-9c77-5309597b870f-kube-api-access-kgfmp\") pod \"cert-manager-cainjector-7f985d654d-wsd6d\" (UID: \"bb91054f-ac67-458b-9c77-5309597b870f\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wsd6d" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.370322 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpshn\" (UniqueName: \"kubernetes.io/projected/3f302462-ea09-488c-93f0-48e3a331b672-kube-api-access-xpshn\") pod \"cert-manager-webhook-5655c58dd6-hxx5j\" (UID: \"3f302462-ea09-488c-93f0-48e3a331b672\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-hxx5j" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.470942 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrdbz\" (UniqueName: \"kubernetes.io/projected/ca95ceab-81bd-4963-80c1-321c5b1c63ef-kube-api-access-jrdbz\") pod \"cert-manager-5b446d88c5-xlhlh\" (UID: \"ca95ceab-81bd-4963-80c1-321c5b1c63ef\") " pod="cert-manager/cert-manager-5b446d88c5-xlhlh" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.471016 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgfmp\" (UniqueName: \"kubernetes.io/projected/bb91054f-ac67-458b-9c77-5309597b870f-kube-api-access-kgfmp\") pod \"cert-manager-cainjector-7f985d654d-wsd6d\" (UID: \"bb91054f-ac67-458b-9c77-5309597b870f\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wsd6d" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.471064 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpshn\" (UniqueName: \"kubernetes.io/projected/3f302462-ea09-488c-93f0-48e3a331b672-kube-api-access-xpshn\") pod \"cert-manager-webhook-5655c58dd6-hxx5j\" (UID: \"3f302462-ea09-488c-93f0-48e3a331b672\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-hxx5j" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.490271 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrdbz\" (UniqueName: \"kubernetes.io/projected/ca95ceab-81bd-4963-80c1-321c5b1c63ef-kube-api-access-jrdbz\") pod \"cert-manager-5b446d88c5-xlhlh\" (UID: \"ca95ceab-81bd-4963-80c1-321c5b1c63ef\") " pod="cert-manager/cert-manager-5b446d88c5-xlhlh" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.493707 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-kgfmp\" (UniqueName: \"kubernetes.io/projected/bb91054f-ac67-458b-9c77-5309597b870f-kube-api-access-kgfmp\") pod \"cert-manager-cainjector-7f985d654d-wsd6d\" (UID: \"bb91054f-ac67-458b-9c77-5309597b870f\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wsd6d" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.496739 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpshn\" (UniqueName: \"kubernetes.io/projected/3f302462-ea09-488c-93f0-48e3a331b672-kube-api-access-xpshn\") pod \"cert-manager-webhook-5655c58dd6-hxx5j\" (UID: \"3f302462-ea09-488c-93f0-48e3a331b672\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-hxx5j" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.534799 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-wsd6d" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.542892 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-xlhlh" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.549568 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-hxx5j" Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.731282 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xlhlh"] Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.738644 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.795082 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-xlhlh" event={"ID":"ca95ceab-81bd-4963-80c1-321c5b1c63ef","Type":"ContainerStarted","Data":"9d23fa420a096ab81cd44d5ec18900e8251d73732d504a5c7b449582255a01b0"} Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.963355 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-hxx5j"] Oct 03 18:24:24 crc kubenswrapper[4835]: W1003 18:24:24.966325 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f302462_ea09_488c_93f0_48e3a331b672.slice/crio-3ebfbc6cfee9704395ae80fa169254039cc1242690b89dddc61fb9006cefb17e WatchSource:0}: Error finding container 3ebfbc6cfee9704395ae80fa169254039cc1242690b89dddc61fb9006cefb17e: Status 404 returned error can't find the container with id 3ebfbc6cfee9704395ae80fa169254039cc1242690b89dddc61fb9006cefb17e Oct 03 18:24:24 crc kubenswrapper[4835]: I1003 18:24:24.995032 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wsd6d"] Oct 03 18:24:24 crc kubenswrapper[4835]: W1003 18:24:24.995453 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb91054f_ac67_458b_9c77_5309597b870f.slice/crio-1db16cc093d078ef3ddc25fce7ec057d3e8e83408826ba964b820bcc3b9e2c3c WatchSource:0}: Error finding container 1db16cc093d078ef3ddc25fce7ec057d3e8e83408826ba964b820bcc3b9e2c3c: Status 404 returned error can't find the container with id 1db16cc093d078ef3ddc25fce7ec057d3e8e83408826ba964b820bcc3b9e2c3c Oct 03 18:24:25 crc kubenswrapper[4835]: I1003 18:24:25.802340 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-hxx5j" 
event={"ID":"3f302462-ea09-488c-93f0-48e3a331b672","Type":"ContainerStarted","Data":"3ebfbc6cfee9704395ae80fa169254039cc1242690b89dddc61fb9006cefb17e"} Oct 03 18:24:25 crc kubenswrapper[4835]: I1003 18:24:25.803130 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-wsd6d" event={"ID":"bb91054f-ac67-458b-9c77-5309597b870f","Type":"ContainerStarted","Data":"1db16cc093d078ef3ddc25fce7ec057d3e8e83408826ba964b820bcc3b9e2c3c"} Oct 03 18:24:28 crc kubenswrapper[4835]: I1003 18:24:28.819656 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-xlhlh" event={"ID":"ca95ceab-81bd-4963-80c1-321c5b1c63ef","Type":"ContainerStarted","Data":"6bf01103e08b2dd707ccc01b004d7087e136b771c991d33d2ad5063032dbfa6a"} Oct 03 18:24:28 crc kubenswrapper[4835]: I1003 18:24:28.835862 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-xlhlh" podStartSLOduration=1.235522112 podStartE2EDuration="4.835842754s" podCreationTimestamp="2025-10-03 18:24:24 +0000 UTC" firstStartedPulling="2025-10-03 18:24:24.738375411 +0000 UTC m=+606.454316293" lastFinishedPulling="2025-10-03 18:24:28.338696063 +0000 UTC m=+610.054636935" observedRunningTime="2025-10-03 18:24:28.831787453 +0000 UTC m=+610.547728345" watchObservedRunningTime="2025-10-03 18:24:28.835842754 +0000 UTC m=+610.551783626" Oct 03 18:24:30 crc kubenswrapper[4835]: I1003 18:24:30.833627 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-wsd6d" event={"ID":"bb91054f-ac67-458b-9c77-5309597b870f","Type":"ContainerStarted","Data":"a9403085753e47a3479f2e6e8b31c1dab47fd417d4fa6c34faa87453a026b93f"} Oct 03 18:24:30 crc kubenswrapper[4835]: I1003 18:24:30.835829 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-hxx5j" event={"ID":"3f302462-ea09-488c-93f0-48e3a331b672","Type":"ContainerStarted","Data":"f4c100fc65257a3ab30199f781746f7c82bbfc3abb9bd7e7b3f7d3060c052098"} Oct 03 18:24:30 crc kubenswrapper[4835]: I1003 18:24:30.836007 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-hxx5j" Oct 03 18:24:30 crc kubenswrapper[4835]: I1003 18:24:30.847376 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-wsd6d" podStartSLOduration=2.129825886 podStartE2EDuration="6.847360804s" podCreationTimestamp="2025-10-03 18:24:24 +0000 UTC" firstStartedPulling="2025-10-03 18:24:24.997925459 +0000 UTC m=+606.713866331" lastFinishedPulling="2025-10-03 18:24:29.715460377 +0000 UTC m=+611.431401249" observedRunningTime="2025-10-03 18:24:30.844923299 +0000 UTC m=+612.560864171" watchObservedRunningTime="2025-10-03 18:24:30.847360804 +0000 UTC m=+612.563301676" Oct 03 18:24:30 crc kubenswrapper[4835]: I1003 18:24:30.860182 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-hxx5j" podStartSLOduration=2.170242549 podStartE2EDuration="6.860164783s" podCreationTimestamp="2025-10-03 18:24:24 +0000 UTC" firstStartedPulling="2025-10-03 18:24:24.968307371 +0000 UTC m=+606.684248253" lastFinishedPulling="2025-10-03 18:24:29.658229615 +0000 UTC m=+611.374170487" observedRunningTime="2025-10-03 18:24:30.856955081 +0000 UTC m=+612.572895953" watchObservedRunningTime="2025-10-03 18:24:30.860164783 +0000 UTC m=+612.576105655" Oct 03 
18:24:34 crc kubenswrapper[4835]: I1003 18:24:34.552375 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-hxx5j" Oct 03 18:24:34 crc kubenswrapper[4835]: I1003 18:24:34.785845 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p2w8j"] Oct 03 18:24:34 crc kubenswrapper[4835]: I1003 18:24:34.786297 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovn-controller" containerID="cri-o://20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b" gracePeriod=30 Oct 03 18:24:34 crc kubenswrapper[4835]: I1003 18:24:34.786568 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db" gracePeriod=30 Oct 03 18:24:34 crc kubenswrapper[4835]: I1003 18:24:34.786648 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="sbdb" containerID="cri-o://ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af" gracePeriod=30 Oct 03 18:24:34 crc kubenswrapper[4835]: I1003 18:24:34.786671 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovn-acl-logging" containerID="cri-o://d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c" gracePeriod=30 Oct 03 18:24:34 crc kubenswrapper[4835]: I1003 18:24:34.786657 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="kube-rbac-proxy-node" containerID="cri-o://bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea" gracePeriod=30 Oct 03 18:24:34 crc kubenswrapper[4835]: I1003 18:24:34.786737 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="nbdb" containerID="cri-o://50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8" gracePeriod=30 Oct 03 18:24:34 crc kubenswrapper[4835]: I1003 18:24:34.786746 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="northd" containerID="cri-o://c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9" gracePeriod=30 Oct 03 18:24:34 crc kubenswrapper[4835]: I1003 18:24:34.826370 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovnkube-controller" containerID="cri-o://509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a" gracePeriod=30 Oct 03 18:24:34 crc kubenswrapper[4835]: I1003 18:24:34.855805 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8p9cd_fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93/kube-multus/2.log" Oct 03 18:24:34 crc kubenswrapper[4835]: I1003 
18:24:34.856735 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8p9cd_fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93/kube-multus/1.log" Oct 03 18:24:34 crc kubenswrapper[4835]: I1003 18:24:34.856777 4835 generic.go:334] "Generic (PLEG): container finished" podID="fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93" containerID="c4dfc32a4cce452f819127ad7835b9e48ebe8c563def12944d355e0868bed268" exitCode=2 Oct 03 18:24:34 crc kubenswrapper[4835]: I1003 18:24:34.856804 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8p9cd" event={"ID":"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93","Type":"ContainerDied","Data":"c4dfc32a4cce452f819127ad7835b9e48ebe8c563def12944d355e0868bed268"} Oct 03 18:24:34 crc kubenswrapper[4835]: I1003 18:24:34.856836 4835 scope.go:117] "RemoveContainer" containerID="12ccf52445e391368af99975592bd8f1206e9a136c9bc04732839082fcaecde1" Oct 03 18:24:34 crc kubenswrapper[4835]: I1003 18:24:34.857453 4835 scope.go:117] "RemoveContainer" containerID="c4dfc32a4cce452f819127ad7835b9e48ebe8c563def12944d355e0868bed268" Oct 03 18:24:34 crc kubenswrapper[4835]: E1003 18:24:34.857766 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8p9cd_openshift-multus(fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93)\"" pod="openshift-multus/multus-8p9cd" podUID="fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93" Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.027785 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a is running failed: container process not found" containerID="509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.028645 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a is running failed: container process not found" containerID="509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.028936 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a is running failed: container process not found" containerID="509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.029041 4835 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovnkube-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.073046 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovnkube-controller/3.log" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.075993 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovn-acl-logging/0.log" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.076526 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovn-controller/0.log" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.077348 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.127143 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lqvn5"] Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.127488 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovn-acl-logging" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.127504 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovn-acl-logging" Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.127519 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="nbdb" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.127526 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="nbdb" Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.127540 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovnkube-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.127546 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovnkube-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.127553 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovnkube-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.127559 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovnkube-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.127570 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="kubecfg-setup" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.127576 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="kubecfg-setup" Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.127589 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="sbdb" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.127595 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="sbdb" Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.127604 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovnkube-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 
18:24:35.127610 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovnkube-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.127617 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="northd" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.127628 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="northd" Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.127639 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.127646 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.127666 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovn-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.127672 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovn-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.127680 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="kube-rbac-proxy-node" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.127686 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="kube-rbac-proxy-node" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.127981 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="kube-rbac-proxy-node" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.127998 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovn-acl-logging" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.128005 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovnkube-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.128030 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="sbdb" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.128044 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.128056 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovnkube-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.128152 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovnkube-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.128165 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="nbdb" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.128176 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" 
containerName="ovnkube-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.128194 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="northd" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.128213 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovn-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.128891 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovnkube-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.128941 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovnkube-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: E1003 18:24:35.128956 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovnkube-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.128963 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovnkube-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.129434 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerName="ovnkube-controller" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.131810 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204141 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-env-overrides\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204185 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-openvswitch\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204206 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovnkube-config\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204221 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-etc-openvswitch\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204251 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovnkube-script-lib\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204278 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-run-ovn-kubernetes\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204305 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204327 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-ovn\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204337 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204345 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-systemd-units\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204369 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204383 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-cni-bin\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204403 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204432 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204426 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204463 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204469 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-kubelet\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204486 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204503 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9z72\" (UniqueName: \"kubernetes.io/projected/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-kube-api-access-m9z72\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204512 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204524 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-var-lib-openvswitch\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204547 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-log-socket\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204573 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-node-log\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204605 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204607 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-run-netns\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204632 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204657 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-node-log" (OuterVolumeSpecName: "node-log") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204658 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovn-node-metrics-cert\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204696 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-systemd\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204735 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-cni-netd\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204735 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204748 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204763 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-slash\") pod \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\" (UID: \"48bbeb2a-b75a-4650-b5ea-b180b8c0168a\") " Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204820 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204844 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.204876 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-slash" (OuterVolumeSpecName: "host-slash") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205058 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-log-socket" (OuterVolumeSpecName: "log-socket") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205595 4835 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205623 4835 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-slash\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205639 4835 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205659 4835 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205713 4835 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205740 4835 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205774 4835 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205817 4835 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205904 4835 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205921 4835 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205931 4835 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205941 4835 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205949 4835 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205959 4835 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205967 4835 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-log-socket\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205975 4835 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-node-log\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.205983 4835 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.209332 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-kube-api-access-m9z72" (OuterVolumeSpecName: "kube-api-access-m9z72") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "kube-api-access-m9z72". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.210847 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.216843 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "48bbeb2a-b75a-4650-b5ea-b180b8c0168a" (UID: "48bbeb2a-b75a-4650-b5ea-b180b8c0168a"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.306891 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-run-openvswitch\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.306952 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-log-socket\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.306970 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-kubelet\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307025 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0fb4f271-21a5-4a09-9715-be9e1a588e0a-ovnkube-script-lib\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307091 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307157 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-node-log\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307184 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-cni-netd\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307210 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-run-netns\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307229 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/0fb4f271-21a5-4a09-9715-be9e1a588e0a-ovnkube-config\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307246 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0fb4f271-21a5-4a09-9715-be9e1a588e0a-ovn-node-metrics-cert\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307268 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-cni-bin\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307402 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g4zn\" (UniqueName: \"kubernetes.io/projected/0fb4f271-21a5-4a09-9715-be9e1a588e0a-kube-api-access-5g4zn\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307475 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-systemd-units\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307500 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-run-systemd\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307517 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-etc-openvswitch\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307544 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-slash\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307580 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0fb4f271-21a5-4a09-9715-be9e1a588e0a-env-overrides\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307681 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-var-lib-openvswitch\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307713 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-run-ovn-kubernetes\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307736 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-run-ovn\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307776 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9z72\" (UniqueName: \"kubernetes.io/projected/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-kube-api-access-m9z72\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307788 4835 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.307797 4835 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/48bbeb2a-b75a-4650-b5ea-b180b8c0168a-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408195 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-node-log\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408243 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-cni-netd\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408267 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-run-netns\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408287 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0fb4f271-21a5-4a09-9715-be9e1a588e0a-ovnkube-config\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408305 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0fb4f271-21a5-4a09-9715-be9e1a588e0a-ovn-node-metrics-cert\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408320 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-cni-bin\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408337 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g4zn\" (UniqueName: \"kubernetes.io/projected/0fb4f271-21a5-4a09-9715-be9e1a588e0a-kube-api-access-5g4zn\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408356 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-systemd-units\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408349 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-node-log\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408392 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-run-netns\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408434 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-cni-bin\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408371 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-run-systemd\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408413 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-run-systemd\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408478 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-etc-openvswitch\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408487 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-cni-netd\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408507 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-slash\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408534 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0fb4f271-21a5-4a09-9715-be9e1a588e0a-env-overrides\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408565 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-var-lib-openvswitch\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408582 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-run-ovn-kubernetes\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408598 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-run-ovn\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408625 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-run-openvswitch\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408643 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-slash\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408662 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-log-socket\") pod \"ovnkube-node-lqvn5\" (UID: 
\"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408682 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-kubelet\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408715 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0fb4f271-21a5-4a09-9715-be9e1a588e0a-ovnkube-script-lib\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408739 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408805 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-etc-openvswitch\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408812 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408847 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-systemd-units\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408913 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-run-openvswitch\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408937 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-var-lib-openvswitch\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408960 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-run-ovn-kubernetes\") pod \"ovnkube-node-lqvn5\" (UID: 
\"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.408981 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-run-ovn\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.409002 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-host-kubelet\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.409009 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0fb4f271-21a5-4a09-9715-be9e1a588e0a-ovnkube-config\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.409022 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0fb4f271-21a5-4a09-9715-be9e1a588e0a-log-socket\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.409285 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0fb4f271-21a5-4a09-9715-be9e1a588e0a-env-overrides\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.409664 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0fb4f271-21a5-4a09-9715-be9e1a588e0a-ovnkube-script-lib\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.411629 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0fb4f271-21a5-4a09-9715-be9e1a588e0a-ovn-node-metrics-cert\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.424411 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g4zn\" (UniqueName: \"kubernetes.io/projected/0fb4f271-21a5-4a09-9715-be9e1a588e0a-kube-api-access-5g4zn\") pod \"ovnkube-node-lqvn5\" (UID: \"0fb4f271-21a5-4a09-9715-be9e1a588e0a\") " pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.457180 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.861961 4835 generic.go:334] "Generic (PLEG): container finished" podID="0fb4f271-21a5-4a09-9715-be9e1a588e0a" containerID="a391f2757986a9c45e0d490ff897f3e875469d16a2ebc05d774f50e4fdf80734" exitCode=0 Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.862061 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" event={"ID":"0fb4f271-21a5-4a09-9715-be9e1a588e0a","Type":"ContainerDied","Data":"a391f2757986a9c45e0d490ff897f3e875469d16a2ebc05d774f50e4fdf80734"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.862389 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" event={"ID":"0fb4f271-21a5-4a09-9715-be9e1a588e0a","Type":"ContainerStarted","Data":"72819b7bc55f195ffa1115dfab8539d2bd8bfe9e405767834cae460bde412d75"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.864918 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8p9cd_fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93/kube-multus/2.log" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.868313 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovnkube-controller/3.log" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870062 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovn-acl-logging/0.log" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870451 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2w8j_48bbeb2a-b75a-4650-b5ea-b180b8c0168a/ovn-controller/0.log" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870677 4835 generic.go:334] "Generic (PLEG): container finished" podID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerID="509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a" exitCode=0 Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870703 4835 generic.go:334] "Generic (PLEG): container finished" podID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerID="ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af" exitCode=0 Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870711 4835 generic.go:334] "Generic (PLEG): container finished" podID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerID="50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8" exitCode=0 Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870718 4835 generic.go:334] "Generic (PLEG): container finished" podID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerID="c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9" exitCode=0 Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870725 4835 generic.go:334] "Generic (PLEG): container finished" podID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerID="0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db" exitCode=0 Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870732 4835 generic.go:334] "Generic (PLEG): container finished" podID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerID="bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea" exitCode=0 Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870738 4835 generic.go:334] "Generic (PLEG): container finished" 
podID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerID="d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c" exitCode=143 Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870746 4835 generic.go:334] "Generic (PLEG): container finished" podID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" containerID="20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b" exitCode=143 Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870763 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerDied","Data":"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870783 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerDied","Data":"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870793 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerDied","Data":"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870802 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerDied","Data":"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870810 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerDied","Data":"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870818 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerDied","Data":"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870828 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870836 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870842 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870848 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870853 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db"} Oct 03 
18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870858 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870862 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870867 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870872 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870878 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerDied","Data":"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870885 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870891 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870896 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870901 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870906 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870911 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870917 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870922 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870926 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b"} Oct 03 
18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870931 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870937 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerDied","Data":"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870945 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870952 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870957 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870962 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870968 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870973 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870978 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870984 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870989 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.870993 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.871000 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" event={"ID":"48bbeb2a-b75a-4650-b5ea-b180b8c0168a","Type":"ContainerDied","Data":"daaf33b06dcdd3e280762ba7c200e6bd76ea98f11b244c4a36077335274dd0f4"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.871008 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.871015 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.871020 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.871025 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.871030 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.871036 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.871041 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.871046 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.871050 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.871055 4835 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc"} Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.871085 4835 scope.go:117] "RemoveContainer" containerID="509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.871224 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p2w8j" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.897040 4835 scope.go:117] "RemoveContainer" containerID="9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.914739 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p2w8j"] Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.918033 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p2w8j"] Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.944751 4835 scope.go:117] "RemoveContainer" containerID="ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.961256 4835 scope.go:117] "RemoveContainer" containerID="50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.983632 4835 scope.go:117] "RemoveContainer" containerID="c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9" Oct 03 18:24:35 crc kubenswrapper[4835]: I1003 18:24:35.997610 4835 scope.go:117] "RemoveContainer" containerID="0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.010277 4835 scope.go:117] "RemoveContainer" containerID="bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.023212 4835 scope.go:117] "RemoveContainer" containerID="d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.036928 4835 scope.go:117] "RemoveContainer" containerID="20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.048999 4835 scope.go:117] "RemoveContainer" containerID="6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.076772 4835 scope.go:117] "RemoveContainer" containerID="509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a" Oct 03 18:24:36 crc kubenswrapper[4835]: E1003 18:24:36.077408 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a\": container with ID starting with 509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a not found: ID does not exist" containerID="509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.077455 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a"} err="failed to get container status \"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a\": rpc error: code = NotFound desc = could not find container \"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a\": container with ID starting with 509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.077487 4835 scope.go:117] "RemoveContainer" containerID="9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0" Oct 03 18:24:36 crc kubenswrapper[4835]: E1003 18:24:36.078144 4835 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\": container with ID starting with 9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0 not found: ID does not exist" containerID="9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.078163 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0"} err="failed to get container status \"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\": rpc error: code = NotFound desc = could not find container \"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\": container with ID starting with 9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0 not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.078180 4835 scope.go:117] "RemoveContainer" containerID="ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af" Oct 03 18:24:36 crc kubenswrapper[4835]: E1003 18:24:36.078612 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\": container with ID starting with ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af not found: ID does not exist" containerID="ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.078631 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af"} err="failed to get container status \"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\": rpc error: code = NotFound desc = could not find container \"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\": container with ID starting with ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.078643 4835 scope.go:117] "RemoveContainer" containerID="50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8" Oct 03 18:24:36 crc kubenswrapper[4835]: E1003 18:24:36.078919 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\": container with ID starting with 50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8 not found: ID does not exist" containerID="50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.079007 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8"} err="failed to get container status \"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\": rpc error: code = NotFound desc = could not find container \"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\": container with ID starting with 50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8 not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.079051 4835 scope.go:117] "RemoveContainer" 
containerID="c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9" Oct 03 18:24:36 crc kubenswrapper[4835]: E1003 18:24:36.079363 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\": container with ID starting with c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9 not found: ID does not exist" containerID="c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.079394 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9"} err="failed to get container status \"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\": rpc error: code = NotFound desc = could not find container \"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\": container with ID starting with c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9 not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.079412 4835 scope.go:117] "RemoveContainer" containerID="0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db" Oct 03 18:24:36 crc kubenswrapper[4835]: E1003 18:24:36.079697 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\": container with ID starting with 0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db not found: ID does not exist" containerID="0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.079723 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db"} err="failed to get container status \"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\": rpc error: code = NotFound desc = could not find container \"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\": container with ID starting with 0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.079738 4835 scope.go:117] "RemoveContainer" containerID="bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea" Oct 03 18:24:36 crc kubenswrapper[4835]: E1003 18:24:36.080182 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\": container with ID starting with bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea not found: ID does not exist" containerID="bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.080270 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea"} err="failed to get container status \"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\": rpc error: code = NotFound desc = could not find container \"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\": container with ID starting with 
bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.080293 4835 scope.go:117] "RemoveContainer" containerID="d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c" Oct 03 18:24:36 crc kubenswrapper[4835]: E1003 18:24:36.080590 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\": container with ID starting with d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c not found: ID does not exist" containerID="d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.080608 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c"} err="failed to get container status \"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\": rpc error: code = NotFound desc = could not find container \"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\": container with ID starting with d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.080621 4835 scope.go:117] "RemoveContainer" containerID="20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b" Oct 03 18:24:36 crc kubenswrapper[4835]: E1003 18:24:36.080939 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\": container with ID starting with 20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b not found: ID does not exist" containerID="20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.080958 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b"} err="failed to get container status \"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\": rpc error: code = NotFound desc = could not find container \"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\": container with ID starting with 20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.080973 4835 scope.go:117] "RemoveContainer" containerID="6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc" Oct 03 18:24:36 crc kubenswrapper[4835]: E1003 18:24:36.081462 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\": container with ID starting with 6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc not found: ID does not exist" containerID="6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.081502 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc"} err="failed to get container status \"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\": rpc 
error: code = NotFound desc = could not find container \"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\": container with ID starting with 6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.081537 4835 scope.go:117] "RemoveContainer" containerID="509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.081812 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a"} err="failed to get container status \"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a\": rpc error: code = NotFound desc = could not find container \"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a\": container with ID starting with 509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.081839 4835 scope.go:117] "RemoveContainer" containerID="9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.082094 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0"} err="failed to get container status \"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\": rpc error: code = NotFound desc = could not find container \"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\": container with ID starting with 9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0 not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.082119 4835 scope.go:117] "RemoveContainer" containerID="ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.082414 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af"} err="failed to get container status \"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\": rpc error: code = NotFound desc = could not find container \"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\": container with ID starting with ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.082454 4835 scope.go:117] "RemoveContainer" containerID="50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.082676 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8"} err="failed to get container status \"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\": rpc error: code = NotFound desc = could not find container \"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\": container with ID starting with 50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8 not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.082701 4835 scope.go:117] "RemoveContainer" containerID="c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9" Oct 03 18:24:36 crc 
kubenswrapper[4835]: I1003 18:24:36.082992 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9"} err="failed to get container status \"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\": rpc error: code = NotFound desc = could not find container \"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\": container with ID starting with c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9 not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.083052 4835 scope.go:117] "RemoveContainer" containerID="0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.083378 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db"} err="failed to get container status \"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\": rpc error: code = NotFound desc = could not find container \"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\": container with ID starting with 0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.083431 4835 scope.go:117] "RemoveContainer" containerID="bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.083656 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea"} err="failed to get container status \"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\": rpc error: code = NotFound desc = could not find container \"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\": container with ID starting with bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.083677 4835 scope.go:117] "RemoveContainer" containerID="d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.083924 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c"} err="failed to get container status \"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\": rpc error: code = NotFound desc = could not find container \"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\": container with ID starting with d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.083988 4835 scope.go:117] "RemoveContainer" containerID="20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.084255 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b"} err="failed to get container status \"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\": rpc error: code = NotFound desc = could not find container \"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\": container with ID 
starting with 20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.084273 4835 scope.go:117] "RemoveContainer" containerID="6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.084476 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc"} err="failed to get container status \"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\": rpc error: code = NotFound desc = could not find container \"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\": container with ID starting with 6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.084561 4835 scope.go:117] "RemoveContainer" containerID="509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.084831 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a"} err="failed to get container status \"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a\": rpc error: code = NotFound desc = could not find container \"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a\": container with ID starting with 509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.084853 4835 scope.go:117] "RemoveContainer" containerID="9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.085233 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0"} err="failed to get container status \"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\": rpc error: code = NotFound desc = could not find container \"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\": container with ID starting with 9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0 not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.085254 4835 scope.go:117] "RemoveContainer" containerID="ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.085474 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af"} err="failed to get container status \"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\": rpc error: code = NotFound desc = could not find container \"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\": container with ID starting with ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.085490 4835 scope.go:117] "RemoveContainer" containerID="50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.085766 4835 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8"} err="failed to get container status \"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\": rpc error: code = NotFound desc = could not find container \"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\": container with ID starting with 50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8 not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.085795 4835 scope.go:117] "RemoveContainer" containerID="c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.086087 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9"} err="failed to get container status \"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\": rpc error: code = NotFound desc = could not find container \"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\": container with ID starting with c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9 not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.086112 4835 scope.go:117] "RemoveContainer" containerID="0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.086494 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db"} err="failed to get container status \"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\": rpc error: code = NotFound desc = could not find container \"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\": container with ID starting with 0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.086520 4835 scope.go:117] "RemoveContainer" containerID="bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.086723 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea"} err="failed to get container status \"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\": rpc error: code = NotFound desc = could not find container \"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\": container with ID starting with bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.086746 4835 scope.go:117] "RemoveContainer" containerID="d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.086920 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c"} err="failed to get container status \"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\": rpc error: code = NotFound desc = could not find container \"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\": container with ID starting with d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c not found: ID does not exist" Oct 
03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.086947 4835 scope.go:117] "RemoveContainer" containerID="20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.087170 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b"} err="failed to get container status \"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\": rpc error: code = NotFound desc = could not find container \"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\": container with ID starting with 20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.087196 4835 scope.go:117] "RemoveContainer" containerID="6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.087565 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc"} err="failed to get container status \"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\": rpc error: code = NotFound desc = could not find container \"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\": container with ID starting with 6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.087586 4835 scope.go:117] "RemoveContainer" containerID="509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.087788 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a"} err="failed to get container status \"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a\": rpc error: code = NotFound desc = could not find container \"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a\": container with ID starting with 509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.087808 4835 scope.go:117] "RemoveContainer" containerID="9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.088025 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0"} err="failed to get container status \"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\": rpc error: code = NotFound desc = could not find container \"9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0\": container with ID starting with 9cf095703ea05d39145db5a20669fdcfc3224bcdbb091444a46ae08dfe9be4c0 not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.088057 4835 scope.go:117] "RemoveContainer" containerID="ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.088407 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af"} err="failed to get container status 
\"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\": rpc error: code = NotFound desc = could not find container \"ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af\": container with ID starting with ed5c30034a6b401053d3ef37570b416a0f767a71ed0becf720faa4b962de59af not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.088430 4835 scope.go:117] "RemoveContainer" containerID="50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.088674 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8"} err="failed to get container status \"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\": rpc error: code = NotFound desc = could not find container \"50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8\": container with ID starting with 50fc7e8fae1bbaffdfe0d8600cd79f4cdea352ab61630c0fb605d698995854c8 not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.088695 4835 scope.go:117] "RemoveContainer" containerID="c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.088996 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9"} err="failed to get container status \"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\": rpc error: code = NotFound desc = could not find container \"c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9\": container with ID starting with c8c3d4ec3bb9d61afd78e7d9912f6a4db1cc7302ea98ade3cbd6a85fbfd6d1a9 not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.089025 4835 scope.go:117] "RemoveContainer" containerID="0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.089308 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db"} err="failed to get container status \"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\": rpc error: code = NotFound desc = could not find container \"0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db\": container with ID starting with 0aebc06737b22bfe1ca5361c6a98c9479c7ba07e12e593ed7d1e9c543952b9db not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.089332 4835 scope.go:117] "RemoveContainer" containerID="bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.089634 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea"} err="failed to get container status \"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\": rpc error: code = NotFound desc = could not find container \"bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea\": container with ID starting with bbb998637df300d3f873e6a0a9186091fc52e7b45c409210f70d8acd03fe1dea not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.089683 4835 scope.go:117] "RemoveContainer" 
containerID="d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.089970 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c"} err="failed to get container status \"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\": rpc error: code = NotFound desc = could not find container \"d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c\": container with ID starting with d7332d87f5d9b744dbaced9fa59714ecba0caae4bd4c1bbe20f3486c8fc2098c not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.089992 4835 scope.go:117] "RemoveContainer" containerID="20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.090261 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b"} err="failed to get container status \"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\": rpc error: code = NotFound desc = could not find container \"20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b\": container with ID starting with 20f71f77603378ac510e3532afb9d2d856fcdbd5e3a2451ff0d88bdee6447d0b not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.090351 4835 scope.go:117] "RemoveContainer" containerID="6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.090635 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc"} err="failed to get container status \"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\": rpc error: code = NotFound desc = could not find container \"6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc\": container with ID starting with 6a0da67916028923fc4d160d36703c530a06dff2133264d4e2d00dc486712ebc not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.090666 4835 scope.go:117] "RemoveContainer" containerID="509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.090928 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a"} err="failed to get container status \"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a\": rpc error: code = NotFound desc = could not find container \"509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a\": container with ID starting with 509d7cf889942463b96f7bb3831b055d3ed95a4e06635523791c53faaa9f5e1a not found: ID does not exist" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.882006 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48bbeb2a-b75a-4650-b5ea-b180b8c0168a" path="/var/lib/kubelet/pods/48bbeb2a-b75a-4650-b5ea-b180b8c0168a/volumes" Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.883259 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" 
event={"ID":"0fb4f271-21a5-4a09-9715-be9e1a588e0a","Type":"ContainerStarted","Data":"86cf7319ed44991b751b5caa3ccc8610dcd753e0a745ade5f3369bd744f07219"} Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.883287 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" event={"ID":"0fb4f271-21a5-4a09-9715-be9e1a588e0a","Type":"ContainerStarted","Data":"7b47cc5d4561082ee5854b5cd276bbbd730beb9bcc03a0396c029fb9dfb865ac"} Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.883297 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" event={"ID":"0fb4f271-21a5-4a09-9715-be9e1a588e0a","Type":"ContainerStarted","Data":"bbbda4535b90096ff05f82f36f3fe3721a56fdd12117defd609af1589dcf9f0b"} Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.883306 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" event={"ID":"0fb4f271-21a5-4a09-9715-be9e1a588e0a","Type":"ContainerStarted","Data":"6ed6117d3dcf51e536c5ea618fcceaa918f0cf03b373044aae0a996bf2d505c8"} Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.883314 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" event={"ID":"0fb4f271-21a5-4a09-9715-be9e1a588e0a","Type":"ContainerStarted","Data":"608307fbd78d84b9fc424bed3521e4e7c7b09a2f83f28fe1f0f9b7c2bacbd209"} Oct 03 18:24:36 crc kubenswrapper[4835]: I1003 18:24:36.883324 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" event={"ID":"0fb4f271-21a5-4a09-9715-be9e1a588e0a","Type":"ContainerStarted","Data":"c2cc4c2f99aee5efd78fd281a3e525886f9f43da146a91e28c0ff3e9df7f6547"} Oct 03 18:24:38 crc kubenswrapper[4835]: I1003 18:24:38.893033 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" event={"ID":"0fb4f271-21a5-4a09-9715-be9e1a588e0a","Type":"ContainerStarted","Data":"bf666dd6ae3139b5277caa15352d777c1fe28a9bd0c43c900ea31ebe6dce71e3"} Oct 03 18:24:41 crc kubenswrapper[4835]: I1003 18:24:41.910333 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" event={"ID":"0fb4f271-21a5-4a09-9715-be9e1a588e0a","Type":"ContainerStarted","Data":"96ed6fa2870176eef95ddea7d53411b05021c0dd77aab694cf584a993e48dfab"} Oct 03 18:24:41 crc kubenswrapper[4835]: I1003 18:24:41.910887 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:41 crc kubenswrapper[4835]: I1003 18:24:41.910903 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:41 crc kubenswrapper[4835]: I1003 18:24:41.936708 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:41 crc kubenswrapper[4835]: I1003 18:24:41.938205 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" podStartSLOduration=6.93818876 podStartE2EDuration="6.93818876s" podCreationTimestamp="2025-10-03 18:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:24:41.936696616 +0000 UTC m=+623.652637488" watchObservedRunningTime="2025-10-03 18:24:41.93818876 +0000 UTC m=+623.654129632" Oct 03 18:24:42 crc kubenswrapper[4835]: 
I1003 18:24:42.915379 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:42 crc kubenswrapper[4835]: I1003 18:24:42.939389 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:24:48 crc kubenswrapper[4835]: I1003 18:24:48.880184 4835 scope.go:117] "RemoveContainer" containerID="c4dfc32a4cce452f819127ad7835b9e48ebe8c563def12944d355e0868bed268" Oct 03 18:24:48 crc kubenswrapper[4835]: E1003 18:24:48.880904 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8p9cd_openshift-multus(fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93)\"" pod="openshift-multus/multus-8p9cd" podUID="fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93" Oct 03 18:25:02 crc kubenswrapper[4835]: I1003 18:25:02.876822 4835 scope.go:117] "RemoveContainer" containerID="c4dfc32a4cce452f819127ad7835b9e48ebe8c563def12944d355e0868bed268" Oct 03 18:25:03 crc kubenswrapper[4835]: I1003 18:25:03.424273 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz"] Oct 03 18:25:03 crc kubenswrapper[4835]: I1003 18:25:03.426061 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:03 crc kubenswrapper[4835]: I1003 18:25:03.431303 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz"] Oct 03 18:25:03 crc kubenswrapper[4835]: I1003 18:25:03.432204 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 18:25:03 crc kubenswrapper[4835]: I1003 18:25:03.619313 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/368259bf-9e2d-45b2-9c90-98b0f1081180-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz\" (UID: \"368259bf-9e2d-45b2-9c90-98b0f1081180\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:03 crc kubenswrapper[4835]: I1003 18:25:03.619639 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqqg6\" (UniqueName: \"kubernetes.io/projected/368259bf-9e2d-45b2-9c90-98b0f1081180-kube-api-access-lqqg6\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz\" (UID: \"368259bf-9e2d-45b2-9c90-98b0f1081180\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:03 crc kubenswrapper[4835]: I1003 18:25:03.619757 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/368259bf-9e2d-45b2-9c90-98b0f1081180-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz\" (UID: \"368259bf-9e2d-45b2-9c90-98b0f1081180\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:03 crc kubenswrapper[4835]: I1003 18:25:03.721441 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/368259bf-9e2d-45b2-9c90-98b0f1081180-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz\" (UID: \"368259bf-9e2d-45b2-9c90-98b0f1081180\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:03 crc kubenswrapper[4835]: I1003 18:25:03.721767 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqqg6\" (UniqueName: \"kubernetes.io/projected/368259bf-9e2d-45b2-9c90-98b0f1081180-kube-api-access-lqqg6\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz\" (UID: \"368259bf-9e2d-45b2-9c90-98b0f1081180\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:03 crc kubenswrapper[4835]: I1003 18:25:03.721933 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/368259bf-9e2d-45b2-9c90-98b0f1081180-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz\" (UID: \"368259bf-9e2d-45b2-9c90-98b0f1081180\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:03 crc kubenswrapper[4835]: I1003 18:25:03.721890 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/368259bf-9e2d-45b2-9c90-98b0f1081180-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz\" (UID: \"368259bf-9e2d-45b2-9c90-98b0f1081180\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:03 crc kubenswrapper[4835]: I1003 18:25:03.722229 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/368259bf-9e2d-45b2-9c90-98b0f1081180-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz\" (UID: \"368259bf-9e2d-45b2-9c90-98b0f1081180\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:03 crc kubenswrapper[4835]: I1003 18:25:03.740557 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqqg6\" (UniqueName: \"kubernetes.io/projected/368259bf-9e2d-45b2-9c90-98b0f1081180-kube-api-access-lqqg6\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz\" (UID: \"368259bf-9e2d-45b2-9c90-98b0f1081180\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:04 crc kubenswrapper[4835]: I1003 18:25:04.017006 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8p9cd_fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93/kube-multus/2.log" Oct 03 18:25:04 crc kubenswrapper[4835]: I1003 18:25:04.017157 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8p9cd" event={"ID":"fd3bdc71-e8c7-4cfa-9230-5bb1c413ae93","Type":"ContainerStarted","Data":"feb0dc1f82d8e2e43ff603e254130d8f9db54968372c6f562b34b8171073e114"} Oct 03 18:25:04 crc kubenswrapper[4835]: I1003 18:25:04.040083 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:04 crc kubenswrapper[4835]: E1003 18:25:04.065426 4835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz_openshift-marketplace_368259bf-9e2d-45b2-9c90-98b0f1081180_0(bfc753a2244dd4bf2f94bef3dbcb6e5d807f1487d9cefdf2e79887ea1051d7bc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 18:25:04 crc kubenswrapper[4835]: E1003 18:25:04.065498 4835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz_openshift-marketplace_368259bf-9e2d-45b2-9c90-98b0f1081180_0(bfc753a2244dd4bf2f94bef3dbcb6e5d807f1487d9cefdf2e79887ea1051d7bc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:04 crc kubenswrapper[4835]: E1003 18:25:04.067697 4835 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz_openshift-marketplace_368259bf-9e2d-45b2-9c90-98b0f1081180_0(bfc753a2244dd4bf2f94bef3dbcb6e5d807f1487d9cefdf2e79887ea1051d7bc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:04 crc kubenswrapper[4835]: E1003 18:25:04.067761 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz_openshift-marketplace(368259bf-9e2d-45b2-9c90-98b0f1081180)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz_openshift-marketplace(368259bf-9e2d-45b2-9c90-98b0f1081180)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz_openshift-marketplace_368259bf-9e2d-45b2-9c90-98b0f1081180_0(bfc753a2244dd4bf2f94bef3dbcb6e5d807f1487d9cefdf2e79887ea1051d7bc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" podUID="368259bf-9e2d-45b2-9c90-98b0f1081180" Oct 03 18:25:05 crc kubenswrapper[4835]: I1003 18:25:05.021762 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:05 crc kubenswrapper[4835]: I1003 18:25:05.022244 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:05 crc kubenswrapper[4835]: I1003 18:25:05.182585 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz"] Oct 03 18:25:05 crc kubenswrapper[4835]: I1003 18:25:05.488257 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lqvn5" Oct 03 18:25:06 crc kubenswrapper[4835]: I1003 18:25:06.027238 4835 generic.go:334] "Generic (PLEG): container finished" podID="368259bf-9e2d-45b2-9c90-98b0f1081180" containerID="3a97a5dc5a398009c1c0375ab7be247e548e9e7ee30f9e1e20ac55e861a1928f" exitCode=0 Oct 03 18:25:06 crc kubenswrapper[4835]: I1003 18:25:06.027505 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" event={"ID":"368259bf-9e2d-45b2-9c90-98b0f1081180","Type":"ContainerDied","Data":"3a97a5dc5a398009c1c0375ab7be247e548e9e7ee30f9e1e20ac55e861a1928f"} Oct 03 18:25:06 crc kubenswrapper[4835]: I1003 18:25:06.027528 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" event={"ID":"368259bf-9e2d-45b2-9c90-98b0f1081180","Type":"ContainerStarted","Data":"667c0162dcd75625e3338e6fd06077bbd1d5af988febdc54ab1b4ed42048d7e5"} Oct 03 18:25:08 crc kubenswrapper[4835]: I1003 18:25:08.037589 4835 generic.go:334] "Generic (PLEG): container finished" podID="368259bf-9e2d-45b2-9c90-98b0f1081180" containerID="d7f4fac5b9f3fbf7ae9aa6a40aa3941fd5e5dd9406bb5175bff72ad7f7ff6b82" exitCode=0 Oct 03 18:25:08 crc kubenswrapper[4835]: I1003 18:25:08.037633 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" event={"ID":"368259bf-9e2d-45b2-9c90-98b0f1081180","Type":"ContainerDied","Data":"d7f4fac5b9f3fbf7ae9aa6a40aa3941fd5e5dd9406bb5175bff72ad7f7ff6b82"} Oct 03 18:25:09 crc kubenswrapper[4835]: I1003 18:25:09.043898 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" event={"ID":"368259bf-9e2d-45b2-9c90-98b0f1081180","Type":"ContainerStarted","Data":"e9c9b3efc21d83e1b7ed0b64318a4879a7606fdf70c1aa799d3932655fa5b540"} Oct 03 18:25:09 crc kubenswrapper[4835]: I1003 18:25:09.058292 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" podStartSLOduration=4.616084448 podStartE2EDuration="6.058271098s" podCreationTimestamp="2025-10-03 18:25:03 +0000 UTC" firstStartedPulling="2025-10-03 18:25:06.029112679 +0000 UTC m=+647.745053551" lastFinishedPulling="2025-10-03 18:25:07.471299329 +0000 UTC m=+649.187240201" observedRunningTime="2025-10-03 18:25:09.057591293 +0000 UTC m=+650.773532165" watchObservedRunningTime="2025-10-03 18:25:09.058271098 +0000 UTC m=+650.774211970" Oct 03 18:25:11 crc kubenswrapper[4835]: I1003 18:25:11.053275 4835 generic.go:334] "Generic (PLEG): container finished" podID="368259bf-9e2d-45b2-9c90-98b0f1081180" containerID="e9c9b3efc21d83e1b7ed0b64318a4879a7606fdf70c1aa799d3932655fa5b540" exitCode=0 Oct 03 18:25:11 crc kubenswrapper[4835]: I1003 18:25:11.053307 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" event={"ID":"368259bf-9e2d-45b2-9c90-98b0f1081180","Type":"ContainerDied","Data":"e9c9b3efc21d83e1b7ed0b64318a4879a7606fdf70c1aa799d3932655fa5b540"} Oct 03 18:25:12 crc kubenswrapper[4835]: I1003 18:25:12.254592 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:12 crc kubenswrapper[4835]: I1003 18:25:12.419607 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/368259bf-9e2d-45b2-9c90-98b0f1081180-bundle\") pod \"368259bf-9e2d-45b2-9c90-98b0f1081180\" (UID: \"368259bf-9e2d-45b2-9c90-98b0f1081180\") " Oct 03 18:25:12 crc kubenswrapper[4835]: I1003 18:25:12.419909 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/368259bf-9e2d-45b2-9c90-98b0f1081180-util\") pod \"368259bf-9e2d-45b2-9c90-98b0f1081180\" (UID: \"368259bf-9e2d-45b2-9c90-98b0f1081180\") " Oct 03 18:25:12 crc kubenswrapper[4835]: I1003 18:25:12.419999 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqqg6\" (UniqueName: \"kubernetes.io/projected/368259bf-9e2d-45b2-9c90-98b0f1081180-kube-api-access-lqqg6\") pod \"368259bf-9e2d-45b2-9c90-98b0f1081180\" (UID: \"368259bf-9e2d-45b2-9c90-98b0f1081180\") " Oct 03 18:25:12 crc kubenswrapper[4835]: I1003 18:25:12.421427 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/368259bf-9e2d-45b2-9c90-98b0f1081180-bundle" (OuterVolumeSpecName: "bundle") pod "368259bf-9e2d-45b2-9c90-98b0f1081180" (UID: "368259bf-9e2d-45b2-9c90-98b0f1081180"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:25:12 crc kubenswrapper[4835]: I1003 18:25:12.425475 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/368259bf-9e2d-45b2-9c90-98b0f1081180-kube-api-access-lqqg6" (OuterVolumeSpecName: "kube-api-access-lqqg6") pod "368259bf-9e2d-45b2-9c90-98b0f1081180" (UID: "368259bf-9e2d-45b2-9c90-98b0f1081180"). InnerVolumeSpecName "kube-api-access-lqqg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:25:12 crc kubenswrapper[4835]: I1003 18:25:12.430272 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/368259bf-9e2d-45b2-9c90-98b0f1081180-util" (OuterVolumeSpecName: "util") pod "368259bf-9e2d-45b2-9c90-98b0f1081180" (UID: "368259bf-9e2d-45b2-9c90-98b0f1081180"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:25:12 crc kubenswrapper[4835]: I1003 18:25:12.521527 4835 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/368259bf-9e2d-45b2-9c90-98b0f1081180-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:25:12 crc kubenswrapper[4835]: I1003 18:25:12.521567 4835 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/368259bf-9e2d-45b2-9c90-98b0f1081180-util\") on node \"crc\" DevicePath \"\"" Oct 03 18:25:12 crc kubenswrapper[4835]: I1003 18:25:12.521622 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqqg6\" (UniqueName: \"kubernetes.io/projected/368259bf-9e2d-45b2-9c90-98b0f1081180-kube-api-access-lqqg6\") on node \"crc\" DevicePath \"\"" Oct 03 18:25:13 crc kubenswrapper[4835]: I1003 18:25:13.064496 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" event={"ID":"368259bf-9e2d-45b2-9c90-98b0f1081180","Type":"ContainerDied","Data":"667c0162dcd75625e3338e6fd06077bbd1d5af988febdc54ab1b4ed42048d7e5"} Oct 03 18:25:13 crc kubenswrapper[4835]: I1003 18:25:13.064540 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="667c0162dcd75625e3338e6fd06077bbd1d5af988febdc54ab1b4ed42048d7e5" Oct 03 18:25:13 crc kubenswrapper[4835]: I1003 18:25:13.064550 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.063054 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-9x6c4"] Oct 03 18:25:20 crc kubenswrapper[4835]: E1003 18:25:20.064931 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="368259bf-9e2d-45b2-9c90-98b0f1081180" containerName="extract" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.065139 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="368259bf-9e2d-45b2-9c90-98b0f1081180" containerName="extract" Oct 03 18:25:20 crc kubenswrapper[4835]: E1003 18:25:20.065235 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="368259bf-9e2d-45b2-9c90-98b0f1081180" containerName="util" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.065309 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="368259bf-9e2d-45b2-9c90-98b0f1081180" containerName="util" Oct 03 18:25:20 crc kubenswrapper[4835]: E1003 18:25:20.065385 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="368259bf-9e2d-45b2-9c90-98b0f1081180" containerName="pull" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.065452 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="368259bf-9e2d-45b2-9c90-98b0f1081180" containerName="pull" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.065629 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="368259bf-9e2d-45b2-9c90-98b0f1081180" containerName="extract" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.066196 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-9x6c4" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.069580 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.069822 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-f6m4q" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.069978 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.078047 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-9x6c4"] Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.189152 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t"] Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.190229 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.192761 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.193187 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-ngcdx" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.198701 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-2225x"] Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.199525 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-2225x" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.209376 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcrd8\" (UniqueName: \"kubernetes.io/projected/d0c975ce-2198-4163-b431-7bad685dab35-kube-api-access-gcrd8\") pod \"obo-prometheus-operator-7c8cf85677-9x6c4\" (UID: \"d0c975ce-2198-4163-b431-7bad685dab35\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-9x6c4" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.210178 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t"] Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.229001 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-2225x"] Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.300349 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-lhh2c"] Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.301567 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-lhh2c" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.303649 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.305375 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-xvg6p" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.310343 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f7d6c825-42e2-4396-b3c4-b93c3c2f9442-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5474f66f9-2225x\" (UID: \"f7d6c825-42e2-4396-b3c4-b93c3c2f9442\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-2225x" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.310671 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ec28bb25-3e95-4d64-b5b2-7fecfa63db71-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t\" (UID: \"ec28bb25-3e95-4d64-b5b2-7fecfa63db71\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.310905 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcrd8\" (UniqueName: \"kubernetes.io/projected/d0c975ce-2198-4163-b431-7bad685dab35-kube-api-access-gcrd8\") pod \"obo-prometheus-operator-7c8cf85677-9x6c4\" (UID: \"d0c975ce-2198-4163-b431-7bad685dab35\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-9x6c4" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.311051 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f7d6c825-42e2-4396-b3c4-b93c3c2f9442-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5474f66f9-2225x\" (UID: \"f7d6c825-42e2-4396-b3c4-b93c3c2f9442\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-2225x" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.311186 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ec28bb25-3e95-4d64-b5b2-7fecfa63db71-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t\" (UID: \"ec28bb25-3e95-4d64-b5b2-7fecfa63db71\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.316308 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-lhh2c"] Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.337497 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcrd8\" (UniqueName: \"kubernetes.io/projected/d0c975ce-2198-4163-b431-7bad685dab35-kube-api-access-gcrd8\") pod \"obo-prometheus-operator-7c8cf85677-9x6c4\" (UID: \"d0c975ce-2198-4163-b431-7bad685dab35\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-9x6c4" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.387927 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-9x6c4" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.400607 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-cq5g4"] Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.401703 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-cq5g4" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.409327 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-6n7k8" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.412527 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f7d6c825-42e2-4396-b3c4-b93c3c2f9442-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5474f66f9-2225x\" (UID: \"f7d6c825-42e2-4396-b3c4-b93c3c2f9442\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-2225x" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.412721 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ec28bb25-3e95-4d64-b5b2-7fecfa63db71-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t\" (UID: \"ec28bb25-3e95-4d64-b5b2-7fecfa63db71\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.412849 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f7d6c825-42e2-4396-b3c4-b93c3c2f9442-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5474f66f9-2225x\" (UID: \"f7d6c825-42e2-4396-b3c4-b93c3c2f9442\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-2225x" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.412950 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ec28bb25-3e95-4d64-b5b2-7fecfa63db71-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t\" (UID: \"ec28bb25-3e95-4d64-b5b2-7fecfa63db71\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.413050 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/4334b8f1-99d4-4676-a64c-68704cfe50a8-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-lhh2c\" (UID: \"4334b8f1-99d4-4676-a64c-68704cfe50a8\") " pod="openshift-operators/observability-operator-cc5f78dfc-lhh2c" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.413213 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj7qb\" (UniqueName: \"kubernetes.io/projected/4334b8f1-99d4-4676-a64c-68704cfe50a8-kube-api-access-tj7qb\") pod \"observability-operator-cc5f78dfc-lhh2c\" (UID: \"4334b8f1-99d4-4676-a64c-68704cfe50a8\") " pod="openshift-operators/observability-operator-cc5f78dfc-lhh2c" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.418045 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-cq5g4"] 
Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.418517 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ec28bb25-3e95-4d64-b5b2-7fecfa63db71-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t\" (UID: \"ec28bb25-3e95-4d64-b5b2-7fecfa63db71\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.419390 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f7d6c825-42e2-4396-b3c4-b93c3c2f9442-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5474f66f9-2225x\" (UID: \"f7d6c825-42e2-4396-b3c4-b93c3c2f9442\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-2225x" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.419427 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ec28bb25-3e95-4d64-b5b2-7fecfa63db71-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t\" (UID: \"ec28bb25-3e95-4d64-b5b2-7fecfa63db71\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.421255 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f7d6c825-42e2-4396-b3c4-b93c3c2f9442-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5474f66f9-2225x\" (UID: \"f7d6c825-42e2-4396-b3c4-b93c3c2f9442\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-2225x" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.512048 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.514788 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/477eb3b8-9f75-4dd3-bc90-ec855d242dc8-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-cq5g4\" (UID: \"477eb3b8-9f75-4dd3-bc90-ec855d242dc8\") " pod="openshift-operators/perses-operator-54bc95c9fb-cq5g4" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.514831 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/4334b8f1-99d4-4676-a64c-68704cfe50a8-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-lhh2c\" (UID: \"4334b8f1-99d4-4676-a64c-68704cfe50a8\") " pod="openshift-operators/observability-operator-cc5f78dfc-lhh2c" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.514854 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68mrt\" (UniqueName: \"kubernetes.io/projected/477eb3b8-9f75-4dd3-bc90-ec855d242dc8-kube-api-access-68mrt\") pod \"perses-operator-54bc95c9fb-cq5g4\" (UID: \"477eb3b8-9f75-4dd3-bc90-ec855d242dc8\") " pod="openshift-operators/perses-operator-54bc95c9fb-cq5g4" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.514882 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj7qb\" (UniqueName: \"kubernetes.io/projected/4334b8f1-99d4-4676-a64c-68704cfe50a8-kube-api-access-tj7qb\") pod \"observability-operator-cc5f78dfc-lhh2c\" (UID: \"4334b8f1-99d4-4676-a64c-68704cfe50a8\") " pod="openshift-operators/observability-operator-cc5f78dfc-lhh2c" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.518357 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/4334b8f1-99d4-4676-a64c-68704cfe50a8-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-lhh2c\" (UID: \"4334b8f1-99d4-4676-a64c-68704cfe50a8\") " pod="openshift-operators/observability-operator-cc5f78dfc-lhh2c" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.533505 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-2225x" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.546837 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj7qb\" (UniqueName: \"kubernetes.io/projected/4334b8f1-99d4-4676-a64c-68704cfe50a8-kube-api-access-tj7qb\") pod \"observability-operator-cc5f78dfc-lhh2c\" (UID: \"4334b8f1-99d4-4676-a64c-68704cfe50a8\") " pod="openshift-operators/observability-operator-cc5f78dfc-lhh2c" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.614674 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-lhh2c" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.615592 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/477eb3b8-9f75-4dd3-bc90-ec855d242dc8-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-cq5g4\" (UID: \"477eb3b8-9f75-4dd3-bc90-ec855d242dc8\") " pod="openshift-operators/perses-operator-54bc95c9fb-cq5g4" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.615636 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68mrt\" (UniqueName: \"kubernetes.io/projected/477eb3b8-9f75-4dd3-bc90-ec855d242dc8-kube-api-access-68mrt\") pod \"perses-operator-54bc95c9fb-cq5g4\" (UID: \"477eb3b8-9f75-4dd3-bc90-ec855d242dc8\") " pod="openshift-operators/perses-operator-54bc95c9fb-cq5g4" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.616788 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/477eb3b8-9f75-4dd3-bc90-ec855d242dc8-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-cq5g4\" (UID: \"477eb3b8-9f75-4dd3-bc90-ec855d242dc8\") " pod="openshift-operators/perses-operator-54bc95c9fb-cq5g4" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.668000 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68mrt\" (UniqueName: \"kubernetes.io/projected/477eb3b8-9f75-4dd3-bc90-ec855d242dc8-kube-api-access-68mrt\") pod \"perses-operator-54bc95c9fb-cq5g4\" (UID: \"477eb3b8-9f75-4dd3-bc90-ec855d242dc8\") " pod="openshift-operators/perses-operator-54bc95c9fb-cq5g4" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.678907 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-9x6c4"] Oct 03 18:25:20 crc kubenswrapper[4835]: W1003 18:25:20.738352 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0c975ce_2198_4163_b431_7bad685dab35.slice/crio-a225f0315a2bbf69c56c03ddf5b9790419a79ba39e925c1d46ac5436597d2154 WatchSource:0}: Error finding container a225f0315a2bbf69c56c03ddf5b9790419a79ba39e925c1d46ac5436597d2154: Status 404 returned error can't find the container with id a225f0315a2bbf69c56c03ddf5b9790419a79ba39e925c1d46ac5436597d2154 Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.769912 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-cq5g4" Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.871159 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t"] Oct 03 18:25:20 crc kubenswrapper[4835]: W1003 18:25:20.897804 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec28bb25_3e95_4d64_b5b2_7fecfa63db71.slice/crio-e8f8182149f08c8f34604a8c63d547f62ef8ed1648e1d212e4a0964a0f83f2f8 WatchSource:0}: Error finding container e8f8182149f08c8f34604a8c63d547f62ef8ed1648e1d212e4a0964a0f83f2f8: Status 404 returned error can't find the container with id e8f8182149f08c8f34604a8c63d547f62ef8ed1648e1d212e4a0964a0f83f2f8 Oct 03 18:25:20 crc kubenswrapper[4835]: I1003 18:25:20.932780 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-2225x"] Oct 03 18:25:20 crc kubenswrapper[4835]: W1003 18:25:20.946954 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7d6c825_42e2_4396_b3c4_b93c3c2f9442.slice/crio-9626fbd9b20ec2530e7753151a94e46d218c81b7b23a932a0d81901b8e90ab3e WatchSource:0}: Error finding container 9626fbd9b20ec2530e7753151a94e46d218c81b7b23a932a0d81901b8e90ab3e: Status 404 returned error can't find the container with id 9626fbd9b20ec2530e7753151a94e46d218c81b7b23a932a0d81901b8e90ab3e Oct 03 18:25:21 crc kubenswrapper[4835]: I1003 18:25:20.995464 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-lhh2c"] Oct 03 18:25:21 crc kubenswrapper[4835]: I1003 18:25:21.052435 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-cq5g4"] Oct 03 18:25:21 crc kubenswrapper[4835]: I1003 18:25:21.105335 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-cq5g4" event={"ID":"477eb3b8-9f75-4dd3-bc90-ec855d242dc8","Type":"ContainerStarted","Data":"15ef90f160066cbbfa208be0486256a6680b8dfb1a67756a4094e53f3a2bfd76"} Oct 03 18:25:21 crc kubenswrapper[4835]: I1003 18:25:21.106434 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t" event={"ID":"ec28bb25-3e95-4d64-b5b2-7fecfa63db71","Type":"ContainerStarted","Data":"e8f8182149f08c8f34604a8c63d547f62ef8ed1648e1d212e4a0964a0f83f2f8"} Oct 03 18:25:21 crc kubenswrapper[4835]: I1003 18:25:21.107538 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-lhh2c" event={"ID":"4334b8f1-99d4-4676-a64c-68704cfe50a8","Type":"ContainerStarted","Data":"aeb659e447160d29c252922504b907664bc7f8efb00e8a1691ce7b03cb352799"} Oct 03 18:25:21 crc kubenswrapper[4835]: I1003 18:25:21.108445 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-2225x" event={"ID":"f7d6c825-42e2-4396-b3c4-b93c3c2f9442","Type":"ContainerStarted","Data":"9626fbd9b20ec2530e7753151a94e46d218c81b7b23a932a0d81901b8e90ab3e"} Oct 03 18:25:21 crc kubenswrapper[4835]: I1003 18:25:21.109576 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-9x6c4" 
event={"ID":"d0c975ce-2198-4163-b431-7bad685dab35","Type":"ContainerStarted","Data":"a225f0315a2bbf69c56c03ddf5b9790419a79ba39e925c1d46ac5436597d2154"} Oct 03 18:25:37 crc kubenswrapper[4835]: E1003 18:25:37.286226 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e2681bce57dc9c15701f5591532c2dfe8f19778606661339553a28dc003dbca5" Oct 03 18:25:37 crc kubenswrapper[4835]: E1003 18:25:37.286864 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e2681bce57dc9c15701f5591532c2dfe8f19778606661339553a28dc003dbca5,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:8597c48fc71fc6ec8e87dbe40dace4dbb7b817c1039db608af76a0d90f7ac2d0,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.2.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gcrd8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-7c8cf85677-9x6c4_openshift-operators(d0c975ce-2198-4163-b431-7bad685dab35): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 18:25:37 crc kubenswrapper[4835]: E1003 18:25:37.288108 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-9x6c4" podUID="d0c975ce-2198-4163-b431-7bad685dab35" Oct 03 18:25:38 crc 
kubenswrapper[4835]: I1003 18:25:38.244204 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-cq5g4" event={"ID":"477eb3b8-9f75-4dd3-bc90-ec855d242dc8","Type":"ContainerStarted","Data":"143629a362ebf9c0e3135ebc75b562fd6a87187b3bef8a6f05347c7a51752a67"} Oct 03 18:25:38 crc kubenswrapper[4835]: I1003 18:25:38.244554 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-cq5g4" Oct 03 18:25:38 crc kubenswrapper[4835]: I1003 18:25:38.246705 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t" event={"ID":"ec28bb25-3e95-4d64-b5b2-7fecfa63db71","Type":"ContainerStarted","Data":"f93ffde32cb79256127fa48070253720a2f99eb86fb8b5bdfd0596c9ef385546"} Oct 03 18:25:38 crc kubenswrapper[4835]: I1003 18:25:38.248664 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-lhh2c" event={"ID":"4334b8f1-99d4-4676-a64c-68704cfe50a8","Type":"ContainerStarted","Data":"09e7985f4a5af8588f7d88df150c9eb1309920313c2959a0cd572b66cd15a1e4"} Oct 03 18:25:38 crc kubenswrapper[4835]: I1003 18:25:38.249244 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-lhh2c" Oct 03 18:25:38 crc kubenswrapper[4835]: I1003 18:25:38.251129 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-2225x" event={"ID":"f7d6c825-42e2-4396-b3c4-b93c3c2f9442","Type":"ContainerStarted","Data":"19c037ef28bbbba168179700fd2bd932e8c79ccfd347dec803a7cc9f5c94418f"} Oct 03 18:25:38 crc kubenswrapper[4835]: E1003 18:25:38.253456 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e2681bce57dc9c15701f5591532c2dfe8f19778606661339553a28dc003dbca5\\\"\"" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-9x6c4" podUID="d0c975ce-2198-4163-b431-7bad685dab35" Oct 03 18:25:38 crc kubenswrapper[4835]: I1003 18:25:38.264728 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-cq5g4" podStartSLOduration=1.664369129 podStartE2EDuration="18.264709019s" podCreationTimestamp="2025-10-03 18:25:20 +0000 UTC" firstStartedPulling="2025-10-03 18:25:21.061998871 +0000 UTC m=+662.777939743" lastFinishedPulling="2025-10-03 18:25:37.662338771 +0000 UTC m=+679.378279633" observedRunningTime="2025-10-03 18:25:38.26279066 +0000 UTC m=+679.978731562" watchObservedRunningTime="2025-10-03 18:25:38.264709019 +0000 UTC m=+679.980649891" Oct 03 18:25:38 crc kubenswrapper[4835]: I1003 18:25:38.299456 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t" podStartSLOduration=1.5539679149999999 podStartE2EDuration="18.299438718s" podCreationTimestamp="2025-10-03 18:25:20 +0000 UTC" firstStartedPulling="2025-10-03 18:25:20.914727375 +0000 UTC m=+662.630668247" lastFinishedPulling="2025-10-03 18:25:37.660198178 +0000 UTC m=+679.376139050" observedRunningTime="2025-10-03 18:25:38.29792451 +0000 UTC m=+680.013865382" watchObservedRunningTime="2025-10-03 18:25:38.299438718 +0000 UTC m=+680.015379590" Oct 
03 18:25:38 crc kubenswrapper[4835]: I1003 18:25:38.359404 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-lhh2c" podStartSLOduration=1.66764357 podStartE2EDuration="18.359381818s" podCreationTimestamp="2025-10-03 18:25:20 +0000 UTC" firstStartedPulling="2025-10-03 18:25:21.017176119 +0000 UTC m=+662.733116991" lastFinishedPulling="2025-10-03 18:25:37.708914367 +0000 UTC m=+679.424855239" observedRunningTime="2025-10-03 18:25:38.357273356 +0000 UTC m=+680.073214238" watchObservedRunningTime="2025-10-03 18:25:38.359381818 +0000 UTC m=+680.075322690" Oct 03 18:25:38 crc kubenswrapper[4835]: I1003 18:25:38.377670 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5474f66f9-2225x" podStartSLOduration=1.6644626420000002 podStartE2EDuration="18.377654876s" podCreationTimestamp="2025-10-03 18:25:20 +0000 UTC" firstStartedPulling="2025-10-03 18:25:20.949541457 +0000 UTC m=+662.665482329" lastFinishedPulling="2025-10-03 18:25:37.662733701 +0000 UTC m=+679.378674563" observedRunningTime="2025-10-03 18:25:38.376537018 +0000 UTC m=+680.092477890" watchObservedRunningTime="2025-10-03 18:25:38.377654876 +0000 UTC m=+680.093595748" Oct 03 18:25:38 crc kubenswrapper[4835]: I1003 18:25:38.441831 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-lhh2c" Oct 03 18:25:50 crc kubenswrapper[4835]: I1003 18:25:50.773102 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-cq5g4" Oct 03 18:25:53 crc kubenswrapper[4835]: I1003 18:25:53.321867 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-9x6c4" event={"ID":"d0c975ce-2198-4163-b431-7bad685dab35","Type":"ContainerStarted","Data":"cb44c80e3ece6ce426d765d2356134cd0252bc8e8a2e5ce65932564c60e582b8"} Oct 03 18:25:53 crc kubenswrapper[4835]: I1003 18:25:53.338332 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-9x6c4" podStartSLOduration=1.343281512 podStartE2EDuration="33.338309283s" podCreationTimestamp="2025-10-03 18:25:20 +0000 UTC" firstStartedPulling="2025-10-03 18:25:20.75590699 +0000 UTC m=+662.471847862" lastFinishedPulling="2025-10-03 18:25:52.750934761 +0000 UTC m=+694.466875633" observedRunningTime="2025-10-03 18:25:53.335724379 +0000 UTC m=+695.051665261" watchObservedRunningTime="2025-10-03 18:25:53.338309283 +0000 UTC m=+695.054250165" Oct 03 18:26:05 crc kubenswrapper[4835]: I1003 18:26:05.358855 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:26:05 crc kubenswrapper[4835]: I1003 18:26:05.359377 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:26:09 crc kubenswrapper[4835]: I1003 18:26:09.681419 4835 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx"] Oct 03 18:26:09 crc kubenswrapper[4835]: I1003 18:26:09.682677 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" Oct 03 18:26:09 crc kubenswrapper[4835]: I1003 18:26:09.684285 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 18:26:09 crc kubenswrapper[4835]: I1003 18:26:09.693787 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx"] Oct 03 18:26:09 crc kubenswrapper[4835]: I1003 18:26:09.810682 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd852437-5bbf-421e-ba98-2a923677a63b-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx\" (UID: \"dd852437-5bbf-421e-ba98-2a923677a63b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" Oct 03 18:26:09 crc kubenswrapper[4835]: I1003 18:26:09.810758 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd852437-5bbf-421e-ba98-2a923677a63b-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx\" (UID: \"dd852437-5bbf-421e-ba98-2a923677a63b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" Oct 03 18:26:09 crc kubenswrapper[4835]: I1003 18:26:09.810824 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxlq9\" (UniqueName: \"kubernetes.io/projected/dd852437-5bbf-421e-ba98-2a923677a63b-kube-api-access-fxlq9\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx\" (UID: \"dd852437-5bbf-421e-ba98-2a923677a63b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" Oct 03 18:26:09 crc kubenswrapper[4835]: I1003 18:26:09.912251 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd852437-5bbf-421e-ba98-2a923677a63b-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx\" (UID: \"dd852437-5bbf-421e-ba98-2a923677a63b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" Oct 03 18:26:09 crc kubenswrapper[4835]: I1003 18:26:09.912825 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd852437-5bbf-421e-ba98-2a923677a63b-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx\" (UID: \"dd852437-5bbf-421e-ba98-2a923677a63b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" Oct 03 18:26:09 crc kubenswrapper[4835]: I1003 18:26:09.913150 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd852437-5bbf-421e-ba98-2a923677a63b-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx\" (UID: \"dd852437-5bbf-421e-ba98-2a923677a63b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" Oct 03 18:26:09 crc 
kubenswrapper[4835]: I1003 18:26:09.913390 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd852437-5bbf-421e-ba98-2a923677a63b-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx\" (UID: \"dd852437-5bbf-421e-ba98-2a923677a63b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" Oct 03 18:26:09 crc kubenswrapper[4835]: I1003 18:26:09.913602 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxlq9\" (UniqueName: \"kubernetes.io/projected/dd852437-5bbf-421e-ba98-2a923677a63b-kube-api-access-fxlq9\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx\" (UID: \"dd852437-5bbf-421e-ba98-2a923677a63b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" Oct 03 18:26:09 crc kubenswrapper[4835]: I1003 18:26:09.932160 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxlq9\" (UniqueName: \"kubernetes.io/projected/dd852437-5bbf-421e-ba98-2a923677a63b-kube-api-access-fxlq9\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx\" (UID: \"dd852437-5bbf-421e-ba98-2a923677a63b\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" Oct 03 18:26:10 crc kubenswrapper[4835]: I1003 18:26:10.000079 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" Oct 03 18:26:10 crc kubenswrapper[4835]: I1003 18:26:10.379632 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx"] Oct 03 18:26:10 crc kubenswrapper[4835]: I1003 18:26:10.401132 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" event={"ID":"dd852437-5bbf-421e-ba98-2a923677a63b","Type":"ContainerStarted","Data":"2bf8d9c3e9578d8505740ae54c9845523b956ae777c1f14689927b423b4f7cbd"} Oct 03 18:26:11 crc kubenswrapper[4835]: I1003 18:26:11.407032 4835 generic.go:334] "Generic (PLEG): container finished" podID="dd852437-5bbf-421e-ba98-2a923677a63b" containerID="e6e231e1a3538b6df3d5708a2f3627d18e75bb3efcfb6b5fcf321634656c67f2" exitCode=0 Oct 03 18:26:11 crc kubenswrapper[4835]: I1003 18:26:11.407160 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" event={"ID":"dd852437-5bbf-421e-ba98-2a923677a63b","Type":"ContainerDied","Data":"e6e231e1a3538b6df3d5708a2f3627d18e75bb3efcfb6b5fcf321634656c67f2"} Oct 03 18:26:14 crc kubenswrapper[4835]: I1003 18:26:14.423604 4835 generic.go:334] "Generic (PLEG): container finished" podID="dd852437-5bbf-421e-ba98-2a923677a63b" containerID="7920377ef49fda9b66644769dbaeea2fb801cc956fd89eacfc2faf2391de9d0d" exitCode=0 Oct 03 18:26:14 crc kubenswrapper[4835]: I1003 18:26:14.423648 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" event={"ID":"dd852437-5bbf-421e-ba98-2a923677a63b","Type":"ContainerDied","Data":"7920377ef49fda9b66644769dbaeea2fb801cc956fd89eacfc2faf2391de9d0d"} Oct 03 18:26:15 crc kubenswrapper[4835]: I1003 18:26:15.431142 4835 generic.go:334] "Generic (PLEG): container finished" 
podID="dd852437-5bbf-421e-ba98-2a923677a63b" containerID="4d9b53b618bdd78da803b82c8d1f1bcb823c00a8bc719beaebda47fc062fbe3d" exitCode=0 Oct 03 18:26:15 crc kubenswrapper[4835]: I1003 18:26:15.431193 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" event={"ID":"dd852437-5bbf-421e-ba98-2a923677a63b","Type":"ContainerDied","Data":"4d9b53b618bdd78da803b82c8d1f1bcb823c00a8bc719beaebda47fc062fbe3d"} Oct 03 18:26:16 crc kubenswrapper[4835]: I1003 18:26:16.636568 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" Oct 03 18:26:16 crc kubenswrapper[4835]: I1003 18:26:16.791837 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd852437-5bbf-421e-ba98-2a923677a63b-bundle\") pod \"dd852437-5bbf-421e-ba98-2a923677a63b\" (UID: \"dd852437-5bbf-421e-ba98-2a923677a63b\") " Oct 03 18:26:16 crc kubenswrapper[4835]: I1003 18:26:16.791922 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd852437-5bbf-421e-ba98-2a923677a63b-util\") pod \"dd852437-5bbf-421e-ba98-2a923677a63b\" (UID: \"dd852437-5bbf-421e-ba98-2a923677a63b\") " Oct 03 18:26:16 crc kubenswrapper[4835]: I1003 18:26:16.791993 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxlq9\" (UniqueName: \"kubernetes.io/projected/dd852437-5bbf-421e-ba98-2a923677a63b-kube-api-access-fxlq9\") pod \"dd852437-5bbf-421e-ba98-2a923677a63b\" (UID: \"dd852437-5bbf-421e-ba98-2a923677a63b\") " Oct 03 18:26:16 crc kubenswrapper[4835]: I1003 18:26:16.793156 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd852437-5bbf-421e-ba98-2a923677a63b-bundle" (OuterVolumeSpecName: "bundle") pod "dd852437-5bbf-421e-ba98-2a923677a63b" (UID: "dd852437-5bbf-421e-ba98-2a923677a63b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:26:16 crc kubenswrapper[4835]: I1003 18:26:16.797095 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd852437-5bbf-421e-ba98-2a923677a63b-kube-api-access-fxlq9" (OuterVolumeSpecName: "kube-api-access-fxlq9") pod "dd852437-5bbf-421e-ba98-2a923677a63b" (UID: "dd852437-5bbf-421e-ba98-2a923677a63b"). InnerVolumeSpecName "kube-api-access-fxlq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:26:16 crc kubenswrapper[4835]: I1003 18:26:16.893316 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxlq9\" (UniqueName: \"kubernetes.io/projected/dd852437-5bbf-421e-ba98-2a923677a63b-kube-api-access-fxlq9\") on node \"crc\" DevicePath \"\"" Oct 03 18:26:16 crc kubenswrapper[4835]: I1003 18:26:16.893346 4835 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd852437-5bbf-421e-ba98-2a923677a63b-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:26:17 crc kubenswrapper[4835]: I1003 18:26:17.048606 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd852437-5bbf-421e-ba98-2a923677a63b-util" (OuterVolumeSpecName: "util") pod "dd852437-5bbf-421e-ba98-2a923677a63b" (UID: "dd852437-5bbf-421e-ba98-2a923677a63b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:26:17 crc kubenswrapper[4835]: I1003 18:26:17.096273 4835 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd852437-5bbf-421e-ba98-2a923677a63b-util\") on node \"crc\" DevicePath \"\"" Oct 03 18:26:17 crc kubenswrapper[4835]: I1003 18:26:17.442796 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" event={"ID":"dd852437-5bbf-421e-ba98-2a923677a63b","Type":"ContainerDied","Data":"2bf8d9c3e9578d8505740ae54c9845523b956ae777c1f14689927b423b4f7cbd"} Oct 03 18:26:17 crc kubenswrapper[4835]: I1003 18:26:17.443116 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bf8d9c3e9578d8505740ae54c9845523b956ae777c1f14689927b423b4f7cbd" Oct 03 18:26:17 crc kubenswrapper[4835]: I1003 18:26:17.442875 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx" Oct 03 18:26:21 crc kubenswrapper[4835]: I1003 18:26:21.024720 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-nphxp"] Oct 03 18:26:21 crc kubenswrapper[4835]: E1003 18:26:21.025256 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd852437-5bbf-421e-ba98-2a923677a63b" containerName="util" Oct 03 18:26:21 crc kubenswrapper[4835]: I1003 18:26:21.025270 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd852437-5bbf-421e-ba98-2a923677a63b" containerName="util" Oct 03 18:26:21 crc kubenswrapper[4835]: E1003 18:26:21.025279 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd852437-5bbf-421e-ba98-2a923677a63b" containerName="extract" Oct 03 18:26:21 crc kubenswrapper[4835]: I1003 18:26:21.025285 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd852437-5bbf-421e-ba98-2a923677a63b" containerName="extract" Oct 03 18:26:21 crc kubenswrapper[4835]: E1003 18:26:21.025296 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd852437-5bbf-421e-ba98-2a923677a63b" containerName="pull" Oct 03 18:26:21 crc kubenswrapper[4835]: I1003 18:26:21.025301 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd852437-5bbf-421e-ba98-2a923677a63b" containerName="pull" Oct 03 18:26:21 crc kubenswrapper[4835]: I1003 18:26:21.025392 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd852437-5bbf-421e-ba98-2a923677a63b" containerName="extract" Oct 03 18:26:21 crc kubenswrapper[4835]: I1003 18:26:21.025762 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-nphxp" Oct 03 18:26:21 crc kubenswrapper[4835]: I1003 18:26:21.027944 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-h2s45" Oct 03 18:26:21 crc kubenswrapper[4835]: I1003 18:26:21.028831 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 03 18:26:21 crc kubenswrapper[4835]: I1003 18:26:21.030371 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 03 18:26:21 crc kubenswrapper[4835]: I1003 18:26:21.042252 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-nphxp"] Oct 03 18:26:21 crc kubenswrapper[4835]: I1003 18:26:21.044053 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dh58\" (UniqueName: \"kubernetes.io/projected/6278a01c-35aa-4828-a8b8-2bc36e31b756-kube-api-access-8dh58\") pod \"nmstate-operator-858ddd8f98-nphxp\" (UID: \"6278a01c-35aa-4828-a8b8-2bc36e31b756\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-nphxp" Oct 03 18:26:21 crc kubenswrapper[4835]: I1003 18:26:21.144794 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dh58\" (UniqueName: \"kubernetes.io/projected/6278a01c-35aa-4828-a8b8-2bc36e31b756-kube-api-access-8dh58\") pod \"nmstate-operator-858ddd8f98-nphxp\" (UID: \"6278a01c-35aa-4828-a8b8-2bc36e31b756\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-nphxp" Oct 03 18:26:21 crc kubenswrapper[4835]: I1003 18:26:21.162169 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dh58\" (UniqueName: \"kubernetes.io/projected/6278a01c-35aa-4828-a8b8-2bc36e31b756-kube-api-access-8dh58\") pod \"nmstate-operator-858ddd8f98-nphxp\" (UID: \"6278a01c-35aa-4828-a8b8-2bc36e31b756\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-nphxp" Oct 03 18:26:21 crc kubenswrapper[4835]: I1003 18:26:21.340985 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-nphxp" Oct 03 18:26:21 crc kubenswrapper[4835]: I1003 18:26:21.721194 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-nphxp"] Oct 03 18:26:21 crc kubenswrapper[4835]: W1003 18:26:21.728332 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6278a01c_35aa_4828_a8b8_2bc36e31b756.slice/crio-f84b29dc0055837fdbc4d080364af2ed19bd4f7700844163f01fc05b719b88ea WatchSource:0}: Error finding container f84b29dc0055837fdbc4d080364af2ed19bd4f7700844163f01fc05b719b88ea: Status 404 returned error can't find the container with id f84b29dc0055837fdbc4d080364af2ed19bd4f7700844163f01fc05b719b88ea Oct 03 18:26:22 crc kubenswrapper[4835]: I1003 18:26:22.467208 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-nphxp" event={"ID":"6278a01c-35aa-4828-a8b8-2bc36e31b756","Type":"ContainerStarted","Data":"f84b29dc0055837fdbc4d080364af2ed19bd4f7700844163f01fc05b719b88ea"} Oct 03 18:26:24 crc kubenswrapper[4835]: I1003 18:26:24.479706 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-nphxp" event={"ID":"6278a01c-35aa-4828-a8b8-2bc36e31b756","Type":"ContainerStarted","Data":"020fa9a1f56384bb5b462294867b73c60bd766e7d851d01acc1225f19193643a"} Oct 03 18:26:24 crc kubenswrapper[4835]: I1003 18:26:24.497981 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-nphxp" podStartSLOduration=1.333693584 podStartE2EDuration="3.497958354s" podCreationTimestamp="2025-10-03 18:26:21 +0000 UTC" firstStartedPulling="2025-10-03 18:26:21.731225474 +0000 UTC m=+723.447166346" lastFinishedPulling="2025-10-03 18:26:23.895490244 +0000 UTC m=+725.611431116" observedRunningTime="2025-10-03 18:26:24.495157484 +0000 UTC m=+726.211098356" watchObservedRunningTime="2025-10-03 18:26:24.497958354 +0000 UTC m=+726.213899226" Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.835554 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-4br2m"] Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.838692 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4br2m" Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.841133 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-n7k9n" Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.850959 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-4br2m"] Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.854437 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g"] Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.855143 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g" Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.857555 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.868148 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-q7ccg"] Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.869036 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-q7ccg" Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.871607 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g"] Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.966774 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrbhw\" (UniqueName: \"kubernetes.io/projected/4e2a2a3c-af08-4f6e-88d7-e42c6327d83a-kube-api-access-vrbhw\") pod \"nmstate-metrics-fdff9cb8d-4br2m\" (UID: \"4e2a2a3c-af08-4f6e-88d7-e42c6327d83a\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4br2m" Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.966834 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/51b41f85-9699-4216-857b-1d79a2cbc755-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-t9j4g\" (UID: \"51b41f85-9699-4216-857b-1d79a2cbc755\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g" Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.966896 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwfrr\" (UniqueName: \"kubernetes.io/projected/51b41f85-9699-4216-857b-1d79a2cbc755-kube-api-access-qwfrr\") pod \"nmstate-webhook-6cdbc54649-t9j4g\" (UID: \"51b41f85-9699-4216-857b-1d79a2cbc755\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g" Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.967880 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g"] Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.968769 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g" Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.970664 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-q8vnr" Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.970902 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 03 18:26:30 crc kubenswrapper[4835]: I1003 18:26:30.971318 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.010565 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g"] Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.068244 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrbhw\" (UniqueName: \"kubernetes.io/projected/4e2a2a3c-af08-4f6e-88d7-e42c6327d83a-kube-api-access-vrbhw\") pod \"nmstate-metrics-fdff9cb8d-4br2m\" (UID: \"4e2a2a3c-af08-4f6e-88d7-e42c6327d83a\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4br2m" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.068805 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/51b41f85-9699-4216-857b-1d79a2cbc755-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-t9j4g\" (UID: \"51b41f85-9699-4216-857b-1d79a2cbc755\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.068853 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/efa8995d-566a-4681-8a8d-04c75bb2e5ff-ovs-socket\") pod \"nmstate-handler-q7ccg\" (UID: \"efa8995d-566a-4681-8a8d-04c75bb2e5ff\") " pod="openshift-nmstate/nmstate-handler-q7ccg" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.068880 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/efa8995d-566a-4681-8a8d-04c75bb2e5ff-nmstate-lock\") pod \"nmstate-handler-q7ccg\" (UID: \"efa8995d-566a-4681-8a8d-04c75bb2e5ff\") " pod="openshift-nmstate/nmstate-handler-q7ccg" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.068910 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/efa8995d-566a-4681-8a8d-04c75bb2e5ff-dbus-socket\") pod \"nmstate-handler-q7ccg\" (UID: \"efa8995d-566a-4681-8a8d-04c75bb2e5ff\") " pod="openshift-nmstate/nmstate-handler-q7ccg" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.068943 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7fz9\" (UniqueName: \"kubernetes.io/projected/efa8995d-566a-4681-8a8d-04c75bb2e5ff-kube-api-access-n7fz9\") pod \"nmstate-handler-q7ccg\" (UID: \"efa8995d-566a-4681-8a8d-04c75bb2e5ff\") " pod="openshift-nmstate/nmstate-handler-q7ccg" Oct 03 18:26:31 crc kubenswrapper[4835]: E1003 18:26:31.069093 4835 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 03 18:26:31 crc kubenswrapper[4835]: E1003 18:26:31.069157 4835 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/51b41f85-9699-4216-857b-1d79a2cbc755-tls-key-pair podName:51b41f85-9699-4216-857b-1d79a2cbc755 nodeName:}" failed. No retries permitted until 2025-10-03 18:26:31.569141148 +0000 UTC m=+733.285082020 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/51b41f85-9699-4216-857b-1d79a2cbc755-tls-key-pair") pod "nmstate-webhook-6cdbc54649-t9j4g" (UID: "51b41f85-9699-4216-857b-1d79a2cbc755") : secret "openshift-nmstate-webhook" not found Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.069334 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwfrr\" (UniqueName: \"kubernetes.io/projected/51b41f85-9699-4216-857b-1d79a2cbc755-kube-api-access-qwfrr\") pod \"nmstate-webhook-6cdbc54649-t9j4g\" (UID: \"51b41f85-9699-4216-857b-1d79a2cbc755\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.088723 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwfrr\" (UniqueName: \"kubernetes.io/projected/51b41f85-9699-4216-857b-1d79a2cbc755-kube-api-access-qwfrr\") pod \"nmstate-webhook-6cdbc54649-t9j4g\" (UID: \"51b41f85-9699-4216-857b-1d79a2cbc755\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.088781 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrbhw\" (UniqueName: \"kubernetes.io/projected/4e2a2a3c-af08-4f6e-88d7-e42c6327d83a-kube-api-access-vrbhw\") pod \"nmstate-metrics-fdff9cb8d-4br2m\" (UID: \"4e2a2a3c-af08-4f6e-88d7-e42c6327d83a\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4br2m" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.154698 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4br2m" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.170690 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/394f1e44-03f0-460e-be0b-a526690916d4-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-vpn8g\" (UID: \"394f1e44-03f0-460e-be0b-a526690916d4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.170737 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sbvm\" (UniqueName: \"kubernetes.io/projected/394f1e44-03f0-460e-be0b-a526690916d4-kube-api-access-4sbvm\") pod \"nmstate-console-plugin-6b874cbd85-vpn8g\" (UID: \"394f1e44-03f0-460e-be0b-a526690916d4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.170847 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/394f1e44-03f0-460e-be0b-a526690916d4-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-vpn8g\" (UID: \"394f1e44-03f0-460e-be0b-a526690916d4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.170875 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/efa8995d-566a-4681-8a8d-04c75bb2e5ff-ovs-socket\") pod \"nmstate-handler-q7ccg\" (UID: \"efa8995d-566a-4681-8a8d-04c75bb2e5ff\") " pod="openshift-nmstate/nmstate-handler-q7ccg" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.170899 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/efa8995d-566a-4681-8a8d-04c75bb2e5ff-dbus-socket\") pod \"nmstate-handler-q7ccg\" (UID: \"efa8995d-566a-4681-8a8d-04c75bb2e5ff\") " pod="openshift-nmstate/nmstate-handler-q7ccg" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.170919 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/efa8995d-566a-4681-8a8d-04c75bb2e5ff-nmstate-lock\") pod \"nmstate-handler-q7ccg\" (UID: \"efa8995d-566a-4681-8a8d-04c75bb2e5ff\") " pod="openshift-nmstate/nmstate-handler-q7ccg" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.170946 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7fz9\" (UniqueName: \"kubernetes.io/projected/efa8995d-566a-4681-8a8d-04c75bb2e5ff-kube-api-access-n7fz9\") pod \"nmstate-handler-q7ccg\" (UID: \"efa8995d-566a-4681-8a8d-04c75bb2e5ff\") " pod="openshift-nmstate/nmstate-handler-q7ccg" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.171239 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/efa8995d-566a-4681-8a8d-04c75bb2e5ff-dbus-socket\") pod \"nmstate-handler-q7ccg\" (UID: \"efa8995d-566a-4681-8a8d-04c75bb2e5ff\") " pod="openshift-nmstate/nmstate-handler-q7ccg" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.171329 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/efa8995d-566a-4681-8a8d-04c75bb2e5ff-ovs-socket\") pod 
\"nmstate-handler-q7ccg\" (UID: \"efa8995d-566a-4681-8a8d-04c75bb2e5ff\") " pod="openshift-nmstate/nmstate-handler-q7ccg" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.171677 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/efa8995d-566a-4681-8a8d-04c75bb2e5ff-nmstate-lock\") pod \"nmstate-handler-q7ccg\" (UID: \"efa8995d-566a-4681-8a8d-04c75bb2e5ff\") " pod="openshift-nmstate/nmstate-handler-q7ccg" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.183676 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-948c4f479-lsv7v"] Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.184392 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.196569 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7fz9\" (UniqueName: \"kubernetes.io/projected/efa8995d-566a-4681-8a8d-04c75bb2e5ff-kube-api-access-n7fz9\") pod \"nmstate-handler-q7ccg\" (UID: \"efa8995d-566a-4681-8a8d-04c75bb2e5ff\") " pod="openshift-nmstate/nmstate-handler-q7ccg" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.203826 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-948c4f479-lsv7v"] Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.271945 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/394f1e44-03f0-460e-be0b-a526690916d4-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-vpn8g\" (UID: \"394f1e44-03f0-460e-be0b-a526690916d4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.272294 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sbvm\" (UniqueName: \"kubernetes.io/projected/394f1e44-03f0-460e-be0b-a526690916d4-kube-api-access-4sbvm\") pod \"nmstate-console-plugin-6b874cbd85-vpn8g\" (UID: \"394f1e44-03f0-460e-be0b-a526690916d4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.272364 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/394f1e44-03f0-460e-be0b-a526690916d4-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-vpn8g\" (UID: \"394f1e44-03f0-460e-be0b-a526690916d4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.273865 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/394f1e44-03f0-460e-be0b-a526690916d4-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-vpn8g\" (UID: \"394f1e44-03f0-460e-be0b-a526690916d4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.277515 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/394f1e44-03f0-460e-be0b-a526690916d4-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-vpn8g\" (UID: \"394f1e44-03f0-460e-be0b-a526690916d4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.292088 
4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sbvm\" (UniqueName: \"kubernetes.io/projected/394f1e44-03f0-460e-be0b-a526690916d4-kube-api-access-4sbvm\") pod \"nmstate-console-plugin-6b874cbd85-vpn8g\" (UID: \"394f1e44-03f0-460e-be0b-a526690916d4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.373172 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d56061eb-bb0a-4dd1-96f6-4244a4e15795-service-ca\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.373236 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgbtx\" (UniqueName: \"kubernetes.io/projected/d56061eb-bb0a-4dd1-96f6-4244a4e15795-kube-api-access-mgbtx\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.373271 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d56061eb-bb0a-4dd1-96f6-4244a4e15795-console-serving-cert\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.373383 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d56061eb-bb0a-4dd1-96f6-4244a4e15795-console-config\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.373463 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d56061eb-bb0a-4dd1-96f6-4244a4e15795-console-oauth-config\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.373520 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d56061eb-bb0a-4dd1-96f6-4244a4e15795-oauth-serving-cert\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.373548 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d56061eb-bb0a-4dd1-96f6-4244a4e15795-trusted-ca-bundle\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.375448 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-4br2m"] Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.475111 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d56061eb-bb0a-4dd1-96f6-4244a4e15795-oauth-serving-cert\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.475175 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d56061eb-bb0a-4dd1-96f6-4244a4e15795-trusted-ca-bundle\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.475207 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d56061eb-bb0a-4dd1-96f6-4244a4e15795-service-ca\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.475236 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgbtx\" (UniqueName: \"kubernetes.io/projected/d56061eb-bb0a-4dd1-96f6-4244a4e15795-kube-api-access-mgbtx\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.475257 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d56061eb-bb0a-4dd1-96f6-4244a4e15795-console-serving-cert\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.475286 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d56061eb-bb0a-4dd1-96f6-4244a4e15795-console-config\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.475305 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d56061eb-bb0a-4dd1-96f6-4244a4e15795-console-oauth-config\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.476763 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d56061eb-bb0a-4dd1-96f6-4244a4e15795-service-ca\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.476892 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d56061eb-bb0a-4dd1-96f6-4244a4e15795-oauth-serving-cert\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.476912 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/d56061eb-bb0a-4dd1-96f6-4244a4e15795-console-config\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.477644 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d56061eb-bb0a-4dd1-96f6-4244a4e15795-trusted-ca-bundle\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.480018 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d56061eb-bb0a-4dd1-96f6-4244a4e15795-console-oauth-config\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.480531 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d56061eb-bb0a-4dd1-96f6-4244a4e15795-console-serving-cert\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.490691 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgbtx\" (UniqueName: \"kubernetes.io/projected/d56061eb-bb0a-4dd1-96f6-4244a4e15795-kube-api-access-mgbtx\") pod \"console-948c4f479-lsv7v\" (UID: \"d56061eb-bb0a-4dd1-96f6-4244a4e15795\") " pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.495552 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-q7ccg" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.517969 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4br2m" event={"ID":"4e2a2a3c-af08-4f6e-88d7-e42c6327d83a","Type":"ContainerStarted","Data":"fea23f6992865e5b68c0332574208cba66440b61e1fc1afe0c07154bcf8820b5"} Oct 03 18:26:31 crc kubenswrapper[4835]: W1003 18:26:31.518224 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefa8995d_566a_4681_8a8d_04c75bb2e5ff.slice/crio-0a1ca882f407844ce6797c98eb6b12c8ee6f91f34b05f9328bd15dbfee0d373d WatchSource:0}: Error finding container 0a1ca882f407844ce6797c98eb6b12c8ee6f91f34b05f9328bd15dbfee0d373d: Status 404 returned error can't find the container with id 0a1ca882f407844ce6797c98eb6b12c8ee6f91f34b05f9328bd15dbfee0d373d Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.527365 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.577178 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/51b41f85-9699-4216-857b-1d79a2cbc755-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-t9j4g\" (UID: \"51b41f85-9699-4216-857b-1d79a2cbc755\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.580662 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/51b41f85-9699-4216-857b-1d79a2cbc755-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-t9j4g\" (UID: \"51b41f85-9699-4216-857b-1d79a2cbc755\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.584438 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g" Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.703576 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-948c4f479-lsv7v"] Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.767091 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g"] Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.769098 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g" Oct 03 18:26:31 crc kubenswrapper[4835]: W1003 18:26:31.773542 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod394f1e44_03f0_460e_be0b_a526690916d4.slice/crio-27e326c1f97b820d18a30aaf626ebae5cbd57cad64ba2e375aa21798a3c9be3d WatchSource:0}: Error finding container 27e326c1f97b820d18a30aaf626ebae5cbd57cad64ba2e375aa21798a3c9be3d: Status 404 returned error can't find the container with id 27e326c1f97b820d18a30aaf626ebae5cbd57cad64ba2e375aa21798a3c9be3d Oct 03 18:26:31 crc kubenswrapper[4835]: I1003 18:26:31.926663 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g"] Oct 03 18:26:31 crc kubenswrapper[4835]: W1003 18:26:31.934298 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51b41f85_9699_4216_857b_1d79a2cbc755.slice/crio-ec5c410912205c4df5d8594f3c75e2c987322a769ce29432c9dcee072626aa2a WatchSource:0}: Error finding container ec5c410912205c4df5d8594f3c75e2c987322a769ce29432c9dcee072626aa2a: Status 404 returned error can't find the container with id ec5c410912205c4df5d8594f3c75e2c987322a769ce29432c9dcee072626aa2a Oct 03 18:26:32 crc kubenswrapper[4835]: I1003 18:26:32.524888 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g" event={"ID":"394f1e44-03f0-460e-be0b-a526690916d4","Type":"ContainerStarted","Data":"27e326c1f97b820d18a30aaf626ebae5cbd57cad64ba2e375aa21798a3c9be3d"} Oct 03 18:26:32 crc kubenswrapper[4835]: I1003 18:26:32.526035 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g" event={"ID":"51b41f85-9699-4216-857b-1d79a2cbc755","Type":"ContainerStarted","Data":"ec5c410912205c4df5d8594f3c75e2c987322a769ce29432c9dcee072626aa2a"} Oct 03 18:26:32 crc 
kubenswrapper[4835]: I1003 18:26:32.527657 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-948c4f479-lsv7v" event={"ID":"d56061eb-bb0a-4dd1-96f6-4244a4e15795","Type":"ContainerStarted","Data":"b8909b4693272733d3fed06e3a018270dc8df6f4636a3f6075e4a0ef88e1bbfa"} Oct 03 18:26:32 crc kubenswrapper[4835]: I1003 18:26:32.527694 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-948c4f479-lsv7v" event={"ID":"d56061eb-bb0a-4dd1-96f6-4244a4e15795","Type":"ContainerStarted","Data":"ab1c76841b3fd7c0226f9cf9cc2ee77936bd7aabb0207b28647cf69385c99c40"} Oct 03 18:26:32 crc kubenswrapper[4835]: I1003 18:26:32.528593 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-q7ccg" event={"ID":"efa8995d-566a-4681-8a8d-04c75bb2e5ff","Type":"ContainerStarted","Data":"0a1ca882f407844ce6797c98eb6b12c8ee6f91f34b05f9328bd15dbfee0d373d"} Oct 03 18:26:32 crc kubenswrapper[4835]: I1003 18:26:32.544344 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-948c4f479-lsv7v" podStartSLOduration=1.5443217790000001 podStartE2EDuration="1.544321779s" podCreationTimestamp="2025-10-03 18:26:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:26:32.542031271 +0000 UTC m=+734.257972143" watchObservedRunningTime="2025-10-03 18:26:32.544321779 +0000 UTC m=+734.260262651" Oct 03 18:26:34 crc kubenswrapper[4835]: I1003 18:26:34.541374 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-q7ccg" event={"ID":"efa8995d-566a-4681-8a8d-04c75bb2e5ff","Type":"ContainerStarted","Data":"acb80ae31ab0075e6dc572c7c0d552e28f0f2e54dc55d8c6fe7f33259269479a"} Oct 03 18:26:34 crc kubenswrapper[4835]: I1003 18:26:34.541757 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-q7ccg" Oct 03 18:26:34 crc kubenswrapper[4835]: I1003 18:26:34.544948 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4br2m" event={"ID":"4e2a2a3c-af08-4f6e-88d7-e42c6327d83a","Type":"ContainerStarted","Data":"28c021ba74838c1e6c2b89f62f39435bc31b24356e51057a89df5fb5e7508117"} Oct 03 18:26:34 crc kubenswrapper[4835]: I1003 18:26:34.546674 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g" event={"ID":"394f1e44-03f0-460e-be0b-a526690916d4","Type":"ContainerStarted","Data":"90311e0fd681c417d5139ce2053279d69dc056955931991ab79850bc96d6599c"} Oct 03 18:26:34 crc kubenswrapper[4835]: I1003 18:26:34.549888 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g" event={"ID":"51b41f85-9699-4216-857b-1d79a2cbc755","Type":"ContainerStarted","Data":"79cf04ec8e2a640518363a24cf7301ea1f42381afc9aacd6098d469a81f5efae"} Oct 03 18:26:34 crc kubenswrapper[4835]: I1003 18:26:34.550023 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g" Oct 03 18:26:34 crc kubenswrapper[4835]: I1003 18:26:34.555994 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-q7ccg" podStartSLOduration=1.753023848 podStartE2EDuration="4.555976533s" podCreationTimestamp="2025-10-03 18:26:30 +0000 UTC" firstStartedPulling="2025-10-03 18:26:31.520627191 +0000 UTC 
m=+733.236568063" lastFinishedPulling="2025-10-03 18:26:34.323579876 +0000 UTC m=+736.039520748" observedRunningTime="2025-10-03 18:26:34.554978458 +0000 UTC m=+736.270919340" watchObservedRunningTime="2025-10-03 18:26:34.555976533 +0000 UTC m=+736.271917405" Oct 03 18:26:34 crc kubenswrapper[4835]: I1003 18:26:34.567921 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-vpn8g" podStartSLOduration=2.025976467 podStartE2EDuration="4.56790433s" podCreationTimestamp="2025-10-03 18:26:30 +0000 UTC" firstStartedPulling="2025-10-03 18:26:31.781654263 +0000 UTC m=+733.497595125" lastFinishedPulling="2025-10-03 18:26:34.323582116 +0000 UTC m=+736.039522988" observedRunningTime="2025-10-03 18:26:34.56745718 +0000 UTC m=+736.283398052" watchObservedRunningTime="2025-10-03 18:26:34.56790433 +0000 UTC m=+736.283845212" Oct 03 18:26:34 crc kubenswrapper[4835]: I1003 18:26:34.579484 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g" podStartSLOduration=2.190341108 podStartE2EDuration="4.579465178s" podCreationTimestamp="2025-10-03 18:26:30 +0000 UTC" firstStartedPulling="2025-10-03 18:26:31.936788973 +0000 UTC m=+733.652729855" lastFinishedPulling="2025-10-03 18:26:34.325913053 +0000 UTC m=+736.041853925" observedRunningTime="2025-10-03 18:26:34.579358696 +0000 UTC m=+736.295299568" watchObservedRunningTime="2025-10-03 18:26:34.579465178 +0000 UTC m=+736.295406050" Oct 03 18:26:35 crc kubenswrapper[4835]: I1003 18:26:35.358701 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:26:35 crc kubenswrapper[4835]: I1003 18:26:35.359120 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:26:37 crc kubenswrapper[4835]: I1003 18:26:37.567602 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4br2m" event={"ID":"4e2a2a3c-af08-4f6e-88d7-e42c6327d83a","Type":"ContainerStarted","Data":"1d540a0de2ab8b73d7613ec8415fee5e93e532abbc41662101d640099b448cda"} Oct 03 18:26:37 crc kubenswrapper[4835]: I1003 18:26:37.582900 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-4br2m" podStartSLOduration=2.25171239 podStartE2EDuration="7.582882893s" podCreationTimestamp="2025-10-03 18:26:30 +0000 UTC" firstStartedPulling="2025-10-03 18:26:31.382215259 +0000 UTC m=+733.098156131" lastFinishedPulling="2025-10-03 18:26:36.713385772 +0000 UTC m=+738.429326634" observedRunningTime="2025-10-03 18:26:37.579622582 +0000 UTC m=+739.295563474" watchObservedRunningTime="2025-10-03 18:26:37.582882893 +0000 UTC m=+739.298823765" Oct 03 18:26:41 crc kubenswrapper[4835]: I1003 18:26:41.521719 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-q7ccg" Oct 03 18:26:41 crc kubenswrapper[4835]: I1003 18:26:41.528384 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:41 crc kubenswrapper[4835]: I1003 18:26:41.528459 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:41 crc kubenswrapper[4835]: I1003 18:26:41.533854 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:41 crc kubenswrapper[4835]: I1003 18:26:41.592991 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-948c4f479-lsv7v" Oct 03 18:26:41 crc kubenswrapper[4835]: I1003 18:26:41.638125 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8nwfg"] Oct 03 18:26:42 crc kubenswrapper[4835]: I1003 18:26:42.962771 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4tmrr"] Oct 03 18:26:42 crc kubenswrapper[4835]: I1003 18:26:42.963053 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" podUID="13e2e684-1dc3-4ea7-89a9-05dabb52b7f0" containerName="controller-manager" containerID="cri-o://41be3e5e4a4a9efaef0d63d3e11b3e787f6621e572bb8912680fb5fb4f426bc7" gracePeriod=30 Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.081439 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m"] Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.082626 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" podUID="6b59d837-ca72-447d-8b77-42675b0ec49b" containerName="route-controller-manager" containerID="cri-o://26d88cf13570a96d11c043b5b67c91bf7a957348047ab1d25b9e47af2820fc99" gracePeriod=30 Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.316673 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.420206 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.425293 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-config\") pod \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.425386 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-proxy-ca-bundles\") pod \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.425426 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-serving-cert\") pod \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.425450 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9nwp\" (UniqueName: \"kubernetes.io/projected/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-kube-api-access-r9nwp\") pod \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.425498 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-client-ca\") pod \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\" (UID: \"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0\") " Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.426446 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-config" (OuterVolumeSpecName: "config") pod "13e2e684-1dc3-4ea7-89a9-05dabb52b7f0" (UID: "13e2e684-1dc3-4ea7-89a9-05dabb52b7f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.426550 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-client-ca" (OuterVolumeSpecName: "client-ca") pod "13e2e684-1dc3-4ea7-89a9-05dabb52b7f0" (UID: "13e2e684-1dc3-4ea7-89a9-05dabb52b7f0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.426600 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "13e2e684-1dc3-4ea7-89a9-05dabb52b7f0" (UID: "13e2e684-1dc3-4ea7-89a9-05dabb52b7f0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.431372 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "13e2e684-1dc3-4ea7-89a9-05dabb52b7f0" (UID: "13e2e684-1dc3-4ea7-89a9-05dabb52b7f0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.432476 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-kube-api-access-r9nwp" (OuterVolumeSpecName: "kube-api-access-r9nwp") pod "13e2e684-1dc3-4ea7-89a9-05dabb52b7f0" (UID: "13e2e684-1dc3-4ea7-89a9-05dabb52b7f0"). InnerVolumeSpecName "kube-api-access-r9nwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.527247 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b59d837-ca72-447d-8b77-42675b0ec49b-client-ca\") pod \"6b59d837-ca72-447d-8b77-42675b0ec49b\" (UID: \"6b59d837-ca72-447d-8b77-42675b0ec49b\") " Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.527324 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b59d837-ca72-447d-8b77-42675b0ec49b-config\") pod \"6b59d837-ca72-447d-8b77-42675b0ec49b\" (UID: \"6b59d837-ca72-447d-8b77-42675b0ec49b\") " Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.527442 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b59d837-ca72-447d-8b77-42675b0ec49b-serving-cert\") pod \"6b59d837-ca72-447d-8b77-42675b0ec49b\" (UID: \"6b59d837-ca72-447d-8b77-42675b0ec49b\") " Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.527505 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc8c9\" (UniqueName: \"kubernetes.io/projected/6b59d837-ca72-447d-8b77-42675b0ec49b-kube-api-access-tc8c9\") pod \"6b59d837-ca72-447d-8b77-42675b0ec49b\" (UID: \"6b59d837-ca72-447d-8b77-42675b0ec49b\") " Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.527771 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.527795 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.527810 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.527822 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9nwp\" (UniqueName: \"kubernetes.io/projected/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-kube-api-access-r9nwp\") on node \"crc\" DevicePath \"\"" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.527832 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.528204 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b59d837-ca72-447d-8b77-42675b0ec49b-client-ca" (OuterVolumeSpecName: "client-ca") pod "6b59d837-ca72-447d-8b77-42675b0ec49b" (UID: 
"6b59d837-ca72-447d-8b77-42675b0ec49b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.528267 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b59d837-ca72-447d-8b77-42675b0ec49b-config" (OuterVolumeSpecName: "config") pod "6b59d837-ca72-447d-8b77-42675b0ec49b" (UID: "6b59d837-ca72-447d-8b77-42675b0ec49b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.531692 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b59d837-ca72-447d-8b77-42675b0ec49b-kube-api-access-tc8c9" (OuterVolumeSpecName: "kube-api-access-tc8c9") pod "6b59d837-ca72-447d-8b77-42675b0ec49b" (UID: "6b59d837-ca72-447d-8b77-42675b0ec49b"). InnerVolumeSpecName "kube-api-access-tc8c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.531706 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b59d837-ca72-447d-8b77-42675b0ec49b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6b59d837-ca72-447d-8b77-42675b0ec49b" (UID: "6b59d837-ca72-447d-8b77-42675b0ec49b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.601161 4835 generic.go:334] "Generic (PLEG): container finished" podID="13e2e684-1dc3-4ea7-89a9-05dabb52b7f0" containerID="41be3e5e4a4a9efaef0d63d3e11b3e787f6621e572bb8912680fb5fb4f426bc7" exitCode=0 Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.601221 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" event={"ID":"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0","Type":"ContainerDied","Data":"41be3e5e4a4a9efaef0d63d3e11b3e787f6621e572bb8912680fb5fb4f426bc7"} Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.601227 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.601248 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4tmrr" event={"ID":"13e2e684-1dc3-4ea7-89a9-05dabb52b7f0","Type":"ContainerDied","Data":"909cab64f19c4d7bd96b8881c6133c3f09fedab3827d09c084d01927f5a335e8"} Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.601265 4835 scope.go:117] "RemoveContainer" containerID="41be3e5e4a4a9efaef0d63d3e11b3e787f6621e572bb8912680fb5fb4f426bc7" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.603255 4835 generic.go:334] "Generic (PLEG): container finished" podID="6b59d837-ca72-447d-8b77-42675b0ec49b" containerID="26d88cf13570a96d11c043b5b67c91bf7a957348047ab1d25b9e47af2820fc99" exitCode=0 Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.603307 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" event={"ID":"6b59d837-ca72-447d-8b77-42675b0ec49b","Type":"ContainerDied","Data":"26d88cf13570a96d11c043b5b67c91bf7a957348047ab1d25b9e47af2820fc99"} Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.603335 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.603340 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m" event={"ID":"6b59d837-ca72-447d-8b77-42675b0ec49b","Type":"ContainerDied","Data":"7577de618cd02bd4bf6639a18bd116fe42cab8d9dbac69609e75252193e0268e"} Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.617115 4835 scope.go:117] "RemoveContainer" containerID="41be3e5e4a4a9efaef0d63d3e11b3e787f6621e572bb8912680fb5fb4f426bc7" Oct 03 18:26:43 crc kubenswrapper[4835]: E1003 18:26:43.618243 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41be3e5e4a4a9efaef0d63d3e11b3e787f6621e572bb8912680fb5fb4f426bc7\": container with ID starting with 41be3e5e4a4a9efaef0d63d3e11b3e787f6621e572bb8912680fb5fb4f426bc7 not found: ID does not exist" containerID="41be3e5e4a4a9efaef0d63d3e11b3e787f6621e572bb8912680fb5fb4f426bc7" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.618303 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41be3e5e4a4a9efaef0d63d3e11b3e787f6621e572bb8912680fb5fb4f426bc7"} err="failed to get container status \"41be3e5e4a4a9efaef0d63d3e11b3e787f6621e572bb8912680fb5fb4f426bc7\": rpc error: code = NotFound desc = could not find container \"41be3e5e4a4a9efaef0d63d3e11b3e787f6621e572bb8912680fb5fb4f426bc7\": container with ID starting with 41be3e5e4a4a9efaef0d63d3e11b3e787f6621e572bb8912680fb5fb4f426bc7 not found: ID does not exist" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.618324 4835 scope.go:117] "RemoveContainer" containerID="26d88cf13570a96d11c043b5b67c91bf7a957348047ab1d25b9e47af2820fc99" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.628760 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b59d837-ca72-447d-8b77-42675b0ec49b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.628927 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc8c9\" (UniqueName: \"kubernetes.io/projected/6b59d837-ca72-447d-8b77-42675b0ec49b-kube-api-access-tc8c9\") on node \"crc\" DevicePath \"\"" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.628945 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b59d837-ca72-447d-8b77-42675b0ec49b-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.628956 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b59d837-ca72-447d-8b77-42675b0ec49b-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.638245 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4tmrr"] Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.642509 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4tmrr"] Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.644102 4835 scope.go:117] "RemoveContainer" containerID="26d88cf13570a96d11c043b5b67c91bf7a957348047ab1d25b9e47af2820fc99" Oct 03 18:26:43 crc kubenswrapper[4835]: E1003 18:26:43.644589 4835 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d88cf13570a96d11c043b5b67c91bf7a957348047ab1d25b9e47af2820fc99\": container with ID starting with 26d88cf13570a96d11c043b5b67c91bf7a957348047ab1d25b9e47af2820fc99 not found: ID does not exist" containerID="26d88cf13570a96d11c043b5b67c91bf7a957348047ab1d25b9e47af2820fc99" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.644622 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d88cf13570a96d11c043b5b67c91bf7a957348047ab1d25b9e47af2820fc99"} err="failed to get container status \"26d88cf13570a96d11c043b5b67c91bf7a957348047ab1d25b9e47af2820fc99\": rpc error: code = NotFound desc = could not find container \"26d88cf13570a96d11c043b5b67c91bf7a957348047ab1d25b9e47af2820fc99\": container with ID starting with 26d88cf13570a96d11c043b5b67c91bf7a957348047ab1d25b9e47af2820fc99 not found: ID does not exist" Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.647903 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m"] Oct 03 18:26:43 crc kubenswrapper[4835]: I1003 18:26:43.652631 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mr45m"] Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.800016 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f"] Oct 03 18:26:44 crc kubenswrapper[4835]: E1003 18:26:44.800360 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b59d837-ca72-447d-8b77-42675b0ec49b" containerName="route-controller-manager" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.800377 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b59d837-ca72-447d-8b77-42675b0ec49b" containerName="route-controller-manager" Oct 03 18:26:44 crc kubenswrapper[4835]: E1003 18:26:44.800390 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13e2e684-1dc3-4ea7-89a9-05dabb52b7f0" containerName="controller-manager" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.800399 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="13e2e684-1dc3-4ea7-89a9-05dabb52b7f0" containerName="controller-manager" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.800521 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b59d837-ca72-447d-8b77-42675b0ec49b" containerName="route-controller-manager" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.800537 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="13e2e684-1dc3-4ea7-89a9-05dabb52b7f0" containerName="controller-manager" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.801086 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.805599 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.805809 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.805984 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.806217 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.813914 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.816236 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5899d58b64-nwr8m"] Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.816803 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.820733 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.825664 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.825924 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.827348 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.827435 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5899d58b64-nwr8m"] Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.828174 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.828666 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.828819 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.835468 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f"] Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.835792 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.841647 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czcxx\" (UniqueName: 
\"kubernetes.io/projected/54e40db3-4809-4e15-8fc0-feb76cf95762-kube-api-access-czcxx\") pod \"route-controller-manager-688597f44d-tnn6f\" (UID: \"54e40db3-4809-4e15-8fc0-feb76cf95762\") " pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.841732 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmrqt\" (UniqueName: \"kubernetes.io/projected/e13d146b-0fcf-41d9-bcb0-31d3a444298c-kube-api-access-lmrqt\") pod \"controller-manager-5899d58b64-nwr8m\" (UID: \"e13d146b-0fcf-41d9-bcb0-31d3a444298c\") " pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.841774 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e13d146b-0fcf-41d9-bcb0-31d3a444298c-client-ca\") pod \"controller-manager-5899d58b64-nwr8m\" (UID: \"e13d146b-0fcf-41d9-bcb0-31d3a444298c\") " pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.841809 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54e40db3-4809-4e15-8fc0-feb76cf95762-serving-cert\") pod \"route-controller-manager-688597f44d-tnn6f\" (UID: \"54e40db3-4809-4e15-8fc0-feb76cf95762\") " pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.841834 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54e40db3-4809-4e15-8fc0-feb76cf95762-client-ca\") pod \"route-controller-manager-688597f44d-tnn6f\" (UID: \"54e40db3-4809-4e15-8fc0-feb76cf95762\") " pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.841858 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54e40db3-4809-4e15-8fc0-feb76cf95762-config\") pod \"route-controller-manager-688597f44d-tnn6f\" (UID: \"54e40db3-4809-4e15-8fc0-feb76cf95762\") " pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.842007 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e13d146b-0fcf-41d9-bcb0-31d3a444298c-config\") pod \"controller-manager-5899d58b64-nwr8m\" (UID: \"e13d146b-0fcf-41d9-bcb0-31d3a444298c\") " pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.842035 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e13d146b-0fcf-41d9-bcb0-31d3a444298c-proxy-ca-bundles\") pod \"controller-manager-5899d58b64-nwr8m\" (UID: \"e13d146b-0fcf-41d9-bcb0-31d3a444298c\") " pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.842113 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/e13d146b-0fcf-41d9-bcb0-31d3a444298c-serving-cert\") pod \"controller-manager-5899d58b64-nwr8m\" (UID: \"e13d146b-0fcf-41d9-bcb0-31d3a444298c\") " pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.883048 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13e2e684-1dc3-4ea7-89a9-05dabb52b7f0" path="/var/lib/kubelet/pods/13e2e684-1dc3-4ea7-89a9-05dabb52b7f0/volumes" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.883599 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b59d837-ca72-447d-8b77-42675b0ec49b" path="/var/lib/kubelet/pods/6b59d837-ca72-447d-8b77-42675b0ec49b/volumes" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.942979 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmrqt\" (UniqueName: \"kubernetes.io/projected/e13d146b-0fcf-41d9-bcb0-31d3a444298c-kube-api-access-lmrqt\") pod \"controller-manager-5899d58b64-nwr8m\" (UID: \"e13d146b-0fcf-41d9-bcb0-31d3a444298c\") " pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.943030 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e13d146b-0fcf-41d9-bcb0-31d3a444298c-client-ca\") pod \"controller-manager-5899d58b64-nwr8m\" (UID: \"e13d146b-0fcf-41d9-bcb0-31d3a444298c\") " pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.943295 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54e40db3-4809-4e15-8fc0-feb76cf95762-serving-cert\") pod \"route-controller-manager-688597f44d-tnn6f\" (UID: \"54e40db3-4809-4e15-8fc0-feb76cf95762\") " pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.943338 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54e40db3-4809-4e15-8fc0-feb76cf95762-client-ca\") pod \"route-controller-manager-688597f44d-tnn6f\" (UID: \"54e40db3-4809-4e15-8fc0-feb76cf95762\") " pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.943363 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54e40db3-4809-4e15-8fc0-feb76cf95762-config\") pod \"route-controller-manager-688597f44d-tnn6f\" (UID: \"54e40db3-4809-4e15-8fc0-feb76cf95762\") " pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.943381 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e13d146b-0fcf-41d9-bcb0-31d3a444298c-config\") pod \"controller-manager-5899d58b64-nwr8m\" (UID: \"e13d146b-0fcf-41d9-bcb0-31d3a444298c\") " pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.943401 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/e13d146b-0fcf-41d9-bcb0-31d3a444298c-proxy-ca-bundles\") pod \"controller-manager-5899d58b64-nwr8m\" (UID: \"e13d146b-0fcf-41d9-bcb0-31d3a444298c\") " pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.943441 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e13d146b-0fcf-41d9-bcb0-31d3a444298c-serving-cert\") pod \"controller-manager-5899d58b64-nwr8m\" (UID: \"e13d146b-0fcf-41d9-bcb0-31d3a444298c\") " pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.943466 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czcxx\" (UniqueName: \"kubernetes.io/projected/54e40db3-4809-4e15-8fc0-feb76cf95762-kube-api-access-czcxx\") pod \"route-controller-manager-688597f44d-tnn6f\" (UID: \"54e40db3-4809-4e15-8fc0-feb76cf95762\") " pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.944063 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e13d146b-0fcf-41d9-bcb0-31d3a444298c-client-ca\") pod \"controller-manager-5899d58b64-nwr8m\" (UID: \"e13d146b-0fcf-41d9-bcb0-31d3a444298c\") " pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.944422 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54e40db3-4809-4e15-8fc0-feb76cf95762-client-ca\") pod \"route-controller-manager-688597f44d-tnn6f\" (UID: \"54e40db3-4809-4e15-8fc0-feb76cf95762\") " pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.944599 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54e40db3-4809-4e15-8fc0-feb76cf95762-config\") pod \"route-controller-manager-688597f44d-tnn6f\" (UID: \"54e40db3-4809-4e15-8fc0-feb76cf95762\") " pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.944858 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e13d146b-0fcf-41d9-bcb0-31d3a444298c-config\") pod \"controller-manager-5899d58b64-nwr8m\" (UID: \"e13d146b-0fcf-41d9-bcb0-31d3a444298c\") " pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.945250 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e13d146b-0fcf-41d9-bcb0-31d3a444298c-proxy-ca-bundles\") pod \"controller-manager-5899d58b64-nwr8m\" (UID: \"e13d146b-0fcf-41d9-bcb0-31d3a444298c\") " pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.948380 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54e40db3-4809-4e15-8fc0-feb76cf95762-serving-cert\") pod \"route-controller-manager-688597f44d-tnn6f\" (UID: \"54e40db3-4809-4e15-8fc0-feb76cf95762\") " 
pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.948693 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e13d146b-0fcf-41d9-bcb0-31d3a444298c-serving-cert\") pod \"controller-manager-5899d58b64-nwr8m\" (UID: \"e13d146b-0fcf-41d9-bcb0-31d3a444298c\") " pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.961731 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmrqt\" (UniqueName: \"kubernetes.io/projected/e13d146b-0fcf-41d9-bcb0-31d3a444298c-kube-api-access-lmrqt\") pod \"controller-manager-5899d58b64-nwr8m\" (UID: \"e13d146b-0fcf-41d9-bcb0-31d3a444298c\") " pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:44 crc kubenswrapper[4835]: I1003 18:26:44.972680 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czcxx\" (UniqueName: \"kubernetes.io/projected/54e40db3-4809-4e15-8fc0-feb76cf95762-kube-api-access-czcxx\") pod \"route-controller-manager-688597f44d-tnn6f\" (UID: \"54e40db3-4809-4e15-8fc0-feb76cf95762\") " pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" Oct 03 18:26:45 crc kubenswrapper[4835]: I1003 18:26:45.132665 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" Oct 03 18:26:45 crc kubenswrapper[4835]: I1003 18:26:45.147121 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:45 crc kubenswrapper[4835]: I1003 18:26:45.337536 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5899d58b64-nwr8m"] Oct 03 18:26:45 crc kubenswrapper[4835]: I1003 18:26:45.509153 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f"] Oct 03 18:26:45 crc kubenswrapper[4835]: I1003 18:26:45.617816 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" event={"ID":"54e40db3-4809-4e15-8fc0-feb76cf95762","Type":"ContainerStarted","Data":"e804799784db222cd43eab0497efa34a8869827272c4c724fd00898aae60c07e"} Oct 03 18:26:45 crc kubenswrapper[4835]: I1003 18:26:45.619199 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" event={"ID":"e13d146b-0fcf-41d9-bcb0-31d3a444298c","Type":"ContainerStarted","Data":"f5052d7364d7f4faf93b9bd5adae6e560ab649c5c1881e2c736182d2b6a10306"} Oct 03 18:26:45 crc kubenswrapper[4835]: I1003 18:26:45.619251 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" event={"ID":"e13d146b-0fcf-41d9-bcb0-31d3a444298c","Type":"ContainerStarted","Data":"e4f6b58d7bb9dbb459f896eda64a3f949e309c097720e61e53dd2c9314d7bc34"} Oct 03 18:26:45 crc kubenswrapper[4835]: I1003 18:26:45.619410 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:45 crc kubenswrapper[4835]: I1003 18:26:45.623901 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" Oct 03 18:26:45 crc kubenswrapper[4835]: I1003 18:26:45.674701 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5899d58b64-nwr8m" podStartSLOduration=2.6746750969999997 podStartE2EDuration="2.674675097s" podCreationTimestamp="2025-10-03 18:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:26:45.647047957 +0000 UTC m=+747.362988849" watchObservedRunningTime="2025-10-03 18:26:45.674675097 +0000 UTC m=+747.390615969" Oct 03 18:26:46 crc kubenswrapper[4835]: I1003 18:26:46.625448 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" event={"ID":"54e40db3-4809-4e15-8fc0-feb76cf95762","Type":"ContainerStarted","Data":"20dd1ae2dd5bba56b1488a9ab25ab15f9c428f351b9ae5e11f7eb094e3ae6518"} Oct 03 18:26:46 crc kubenswrapper[4835]: I1003 18:26:46.641039 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" podStartSLOduration=3.6410173329999997 podStartE2EDuration="3.641017333s" podCreationTimestamp="2025-10-03 18:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:26:46.640161101 +0000 UTC m=+748.356101983" watchObservedRunningTime="2025-10-03 18:26:46.641017333 +0000 UTC m=+748.356958205" Oct 03 18:26:47 crc kubenswrapper[4835]: I1003 18:26:47.630159 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" Oct 03 18:26:47 crc kubenswrapper[4835]: I1003 18:26:47.634659 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-688597f44d-tnn6f" Oct 03 18:26:51 crc kubenswrapper[4835]: I1003 18:26:51.019916 4835 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 18:26:51 crc kubenswrapper[4835]: I1003 18:26:51.774674 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-t9j4g" Oct 03 18:26:54 crc kubenswrapper[4835]: I1003 18:26:54.911058 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rkxz8"] Oct 03 18:26:54 crc kubenswrapper[4835]: I1003 18:26:54.913358 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:26:54 crc kubenswrapper[4835]: I1003 18:26:54.921863 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkxz8"] Oct 03 18:26:55 crc kubenswrapper[4835]: I1003 18:26:55.066519 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ac111c-a675-4f41-a9bd-6a7bc90304a7-utilities\") pod \"redhat-marketplace-rkxz8\" (UID: \"88ac111c-a675-4f41-a9bd-6a7bc90304a7\") " pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:26:55 crc kubenswrapper[4835]: I1003 18:26:55.066764 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ac111c-a675-4f41-a9bd-6a7bc90304a7-catalog-content\") pod \"redhat-marketplace-rkxz8\" (UID: \"88ac111c-a675-4f41-a9bd-6a7bc90304a7\") " pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:26:55 crc kubenswrapper[4835]: I1003 18:26:55.066931 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmx2q\" (UniqueName: \"kubernetes.io/projected/88ac111c-a675-4f41-a9bd-6a7bc90304a7-kube-api-access-jmx2q\") pod \"redhat-marketplace-rkxz8\" (UID: \"88ac111c-a675-4f41-a9bd-6a7bc90304a7\") " pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:26:55 crc kubenswrapper[4835]: I1003 18:26:55.167818 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmx2q\" (UniqueName: \"kubernetes.io/projected/88ac111c-a675-4f41-a9bd-6a7bc90304a7-kube-api-access-jmx2q\") pod \"redhat-marketplace-rkxz8\" (UID: \"88ac111c-a675-4f41-a9bd-6a7bc90304a7\") " pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:26:55 crc kubenswrapper[4835]: I1003 18:26:55.167867 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ac111c-a675-4f41-a9bd-6a7bc90304a7-utilities\") pod \"redhat-marketplace-rkxz8\" (UID: \"88ac111c-a675-4f41-a9bd-6a7bc90304a7\") " pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:26:55 crc kubenswrapper[4835]: I1003 18:26:55.167944 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ac111c-a675-4f41-a9bd-6a7bc90304a7-catalog-content\") pod \"redhat-marketplace-rkxz8\" (UID: \"88ac111c-a675-4f41-a9bd-6a7bc90304a7\") " pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:26:55 crc kubenswrapper[4835]: I1003 18:26:55.168413 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ac111c-a675-4f41-a9bd-6a7bc90304a7-utilities\") pod \"redhat-marketplace-rkxz8\" (UID: \"88ac111c-a675-4f41-a9bd-6a7bc90304a7\") " pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:26:55 crc kubenswrapper[4835]: I1003 18:26:55.168428 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ac111c-a675-4f41-a9bd-6a7bc90304a7-catalog-content\") pod \"redhat-marketplace-rkxz8\" (UID: \"88ac111c-a675-4f41-a9bd-6a7bc90304a7\") " pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:26:55 crc kubenswrapper[4835]: I1003 18:26:55.188883 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jmx2q\" (UniqueName: \"kubernetes.io/projected/88ac111c-a675-4f41-a9bd-6a7bc90304a7-kube-api-access-jmx2q\") pod \"redhat-marketplace-rkxz8\" (UID: \"88ac111c-a675-4f41-a9bd-6a7bc90304a7\") " pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:26:55 crc kubenswrapper[4835]: I1003 18:26:55.232379 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:26:55 crc kubenswrapper[4835]: I1003 18:26:55.608876 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkxz8"] Oct 03 18:26:55 crc kubenswrapper[4835]: I1003 18:26:55.670923 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkxz8" event={"ID":"88ac111c-a675-4f41-a9bd-6a7bc90304a7","Type":"ContainerStarted","Data":"4ffa656f919d8b684284c28a2050a69399d76febb97e9b98fc51eeef9be06baf"} Oct 03 18:26:56 crc kubenswrapper[4835]: I1003 18:26:56.679942 4835 generic.go:334] "Generic (PLEG): container finished" podID="88ac111c-a675-4f41-a9bd-6a7bc90304a7" containerID="2b62026aa715070c2d21a6e2a33bf7b5ef54e4e0bce2ea4a21726f946e17d0b1" exitCode=0 Oct 03 18:26:56 crc kubenswrapper[4835]: I1003 18:26:56.680009 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkxz8" event={"ID":"88ac111c-a675-4f41-a9bd-6a7bc90304a7","Type":"ContainerDied","Data":"2b62026aa715070c2d21a6e2a33bf7b5ef54e4e0bce2ea4a21726f946e17d0b1"} Oct 03 18:26:57 crc kubenswrapper[4835]: I1003 18:26:57.690106 4835 generic.go:334] "Generic (PLEG): container finished" podID="88ac111c-a675-4f41-a9bd-6a7bc90304a7" containerID="120fdac88049222c6128164841e8ca3824acec755c36b28946f7931b7729bfc6" exitCode=0 Oct 03 18:26:57 crc kubenswrapper[4835]: I1003 18:26:57.690176 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkxz8" event={"ID":"88ac111c-a675-4f41-a9bd-6a7bc90304a7","Type":"ContainerDied","Data":"120fdac88049222c6128164841e8ca3824acec755c36b28946f7931b7729bfc6"} Oct 03 18:26:58 crc kubenswrapper[4835]: I1003 18:26:58.697093 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkxz8" event={"ID":"88ac111c-a675-4f41-a9bd-6a7bc90304a7","Type":"ContainerStarted","Data":"2af06f8460475ff5668b0dc893bf0f6422d4b9f31fbe3529c335e6191920d953"} Oct 03 18:26:58 crc kubenswrapper[4835]: I1003 18:26:58.719446 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rkxz8" podStartSLOduration=3.323656046 podStartE2EDuration="4.719428184s" podCreationTimestamp="2025-10-03 18:26:54 +0000 UTC" firstStartedPulling="2025-10-03 18:26:56.682378988 +0000 UTC m=+758.398319900" lastFinishedPulling="2025-10-03 18:26:58.078151166 +0000 UTC m=+759.794092038" observedRunningTime="2025-10-03 18:26:58.715465016 +0000 UTC m=+760.431405888" watchObservedRunningTime="2025-10-03 18:26:58.719428184 +0000 UTC m=+760.435369056" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.233348 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.233859 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.281142 4835 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.358541 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.358590 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.358643 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.359189 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6cbefddbf8736040316432cea35633f5b9fb0a39e77bcc8a41c22e4802ea88fe"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.359260 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://6cbefddbf8736040316432cea35633f5b9fb0a39e77bcc8a41c22e4802ea88fe" gracePeriod=600 Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.567146 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p"] Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.568800 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.570436 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.576214 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p"] Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.705241 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjf72\" (UniqueName: \"kubernetes.io/projected/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-kube-api-access-bjf72\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p\" (UID: \"62075311-2baa-4bed-aaf7-df0d5ac3e3f3\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.705356 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p\" (UID: \"62075311-2baa-4bed-aaf7-df0d5ac3e3f3\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.705384 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p\" (UID: \"62075311-2baa-4bed-aaf7-df0d5ac3e3f3\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.732713 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="6cbefddbf8736040316432cea35633f5b9fb0a39e77bcc8a41c22e4802ea88fe" exitCode=0 Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.733014 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"6cbefddbf8736040316432cea35633f5b9fb0a39e77bcc8a41c22e4802ea88fe"} Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.733102 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"fc96018384aa8860a4c2fcec8a03cef5fa41451e8751027f47a38b13cdf1722b"} Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.733138 4835 scope.go:117] "RemoveContainer" containerID="f9f2478d03690f18cde85cd947722003ae9ebb4f9f69a11ddfa8dc6c6d386ff2" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.775440 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.806218 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-bundle\") pod 
\"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p\" (UID: \"62075311-2baa-4bed-aaf7-df0d5ac3e3f3\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.806358 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p\" (UID: \"62075311-2baa-4bed-aaf7-df0d5ac3e3f3\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.806403 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjf72\" (UniqueName: \"kubernetes.io/projected/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-kube-api-access-bjf72\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p\" (UID: \"62075311-2baa-4bed-aaf7-df0d5ac3e3f3\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.806625 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p\" (UID: \"62075311-2baa-4bed-aaf7-df0d5ac3e3f3\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.806863 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p\" (UID: \"62075311-2baa-4bed-aaf7-df0d5ac3e3f3\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.824902 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjf72\" (UniqueName: \"kubernetes.io/projected/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-kube-api-access-bjf72\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p\" (UID: \"62075311-2baa-4bed-aaf7-df0d5ac3e3f3\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" Oct 03 18:27:05 crc kubenswrapper[4835]: I1003 18:27:05.897168 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" Oct 03 18:27:06 crc kubenswrapper[4835]: I1003 18:27:06.319831 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p"] Oct 03 18:27:06 crc kubenswrapper[4835]: W1003 18:27:06.326050 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62075311_2baa_4bed_aaf7_df0d5ac3e3f3.slice/crio-e1373b72bb7c0a7ec4f342f25d1820a69fe472b560611f06910d0106c6a044be WatchSource:0}: Error finding container e1373b72bb7c0a7ec4f342f25d1820a69fe472b560611f06910d0106c6a044be: Status 404 returned error can't find the container with id e1373b72bb7c0a7ec4f342f25d1820a69fe472b560611f06910d0106c6a044be Oct 03 18:27:06 crc kubenswrapper[4835]: I1003 18:27:06.674689 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-8nwfg" podUID="300d2397-b9b1-4f44-9eb2-5757940cc64c" containerName="console" containerID="cri-o://121c5ddf04475b188997326eedc59f80a16b5d56cb32de82704fe42cee031f18" gracePeriod=15 Oct 03 18:27:06 crc kubenswrapper[4835]: I1003 18:27:06.739836 4835 generic.go:334] "Generic (PLEG): container finished" podID="62075311-2baa-4bed-aaf7-df0d5ac3e3f3" containerID="67e422d37e2b4b7783ab9a7ccf9a0fb7460c8e358b288e72c789b1d786dc7576" exitCode=0 Oct 03 18:27:06 crc kubenswrapper[4835]: I1003 18:27:06.739895 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" event={"ID":"62075311-2baa-4bed-aaf7-df0d5ac3e3f3","Type":"ContainerDied","Data":"67e422d37e2b4b7783ab9a7ccf9a0fb7460c8e358b288e72c789b1d786dc7576"} Oct 03 18:27:06 crc kubenswrapper[4835]: I1003 18:27:06.739920 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" event={"ID":"62075311-2baa-4bed-aaf7-df0d5ac3e3f3","Type":"ContainerStarted","Data":"e1373b72bb7c0a7ec4f342f25d1820a69fe472b560611f06910d0106c6a044be"} Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.103740 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8nwfg_300d2397-b9b1-4f44-9eb2-5757940cc64c/console/0.log" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.104191 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.229030 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-oauth-serving-cert\") pod \"300d2397-b9b1-4f44-9eb2-5757940cc64c\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.229101 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-trusted-ca-bundle\") pod \"300d2397-b9b1-4f44-9eb2-5757940cc64c\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.229126 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-service-ca\") pod \"300d2397-b9b1-4f44-9eb2-5757940cc64c\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.229165 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-oauth-config\") pod \"300d2397-b9b1-4f44-9eb2-5757940cc64c\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.229223 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-config\") pod \"300d2397-b9b1-4f44-9eb2-5757940cc64c\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.229284 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whqgr\" (UniqueName: \"kubernetes.io/projected/300d2397-b9b1-4f44-9eb2-5757940cc64c-kube-api-access-whqgr\") pod \"300d2397-b9b1-4f44-9eb2-5757940cc64c\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.229314 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-serving-cert\") pod \"300d2397-b9b1-4f44-9eb2-5757940cc64c\" (UID: \"300d2397-b9b1-4f44-9eb2-5757940cc64c\") " Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.230443 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "300d2397-b9b1-4f44-9eb2-5757940cc64c" (UID: "300d2397-b9b1-4f44-9eb2-5757940cc64c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.230460 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "300d2397-b9b1-4f44-9eb2-5757940cc64c" (UID: "300d2397-b9b1-4f44-9eb2-5757940cc64c"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.230970 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-config" (OuterVolumeSpecName: "console-config") pod "300d2397-b9b1-4f44-9eb2-5757940cc64c" (UID: "300d2397-b9b1-4f44-9eb2-5757940cc64c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.231233 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-service-ca" (OuterVolumeSpecName: "service-ca") pod "300d2397-b9b1-4f44-9eb2-5757940cc64c" (UID: "300d2397-b9b1-4f44-9eb2-5757940cc64c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.238482 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/300d2397-b9b1-4f44-9eb2-5757940cc64c-kube-api-access-whqgr" (OuterVolumeSpecName: "kube-api-access-whqgr") pod "300d2397-b9b1-4f44-9eb2-5757940cc64c" (UID: "300d2397-b9b1-4f44-9eb2-5757940cc64c"). InnerVolumeSpecName "kube-api-access-whqgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.240269 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "300d2397-b9b1-4f44-9eb2-5757940cc64c" (UID: "300d2397-b9b1-4f44-9eb2-5757940cc64c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.240702 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "300d2397-b9b1-4f44-9eb2-5757940cc64c" (UID: "300d2397-b9b1-4f44-9eb2-5757940cc64c"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.331482 4835 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.332379 4835 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.332399 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.332409 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.332418 4835 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.332426 4835 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/300d2397-b9b1-4f44-9eb2-5757940cc64c-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.332435 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whqgr\" (UniqueName: \"kubernetes.io/projected/300d2397-b9b1-4f44-9eb2-5757940cc64c-kube-api-access-whqgr\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.749469 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8nwfg_300d2397-b9b1-4f44-9eb2-5757940cc64c/console/0.log" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.749771 4835 generic.go:334] "Generic (PLEG): container finished" podID="300d2397-b9b1-4f44-9eb2-5757940cc64c" containerID="121c5ddf04475b188997326eedc59f80a16b5d56cb32de82704fe42cee031f18" exitCode=2 Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.749940 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-8nwfg" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.750264 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8nwfg" event={"ID":"300d2397-b9b1-4f44-9eb2-5757940cc64c","Type":"ContainerDied","Data":"121c5ddf04475b188997326eedc59f80a16b5d56cb32de82704fe42cee031f18"} Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.750323 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8nwfg" event={"ID":"300d2397-b9b1-4f44-9eb2-5757940cc64c","Type":"ContainerDied","Data":"ced84b48368a0a676d5a7a87e498c5a9ab595c059b9d76ccce4bef1a40fd55f2"} Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.750355 4835 scope.go:117] "RemoveContainer" containerID="121c5ddf04475b188997326eedc59f80a16b5d56cb32de82704fe42cee031f18" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.775765 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8nwfg"] Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.776392 4835 scope.go:117] "RemoveContainer" containerID="121c5ddf04475b188997326eedc59f80a16b5d56cb32de82704fe42cee031f18" Oct 03 18:27:07 crc kubenswrapper[4835]: E1003 18:27:07.776847 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"121c5ddf04475b188997326eedc59f80a16b5d56cb32de82704fe42cee031f18\": container with ID starting with 121c5ddf04475b188997326eedc59f80a16b5d56cb32de82704fe42cee031f18 not found: ID does not exist" containerID="121c5ddf04475b188997326eedc59f80a16b5d56cb32de82704fe42cee031f18" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.776880 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"121c5ddf04475b188997326eedc59f80a16b5d56cb32de82704fe42cee031f18"} err="failed to get container status \"121c5ddf04475b188997326eedc59f80a16b5d56cb32de82704fe42cee031f18\": rpc error: code = NotFound desc = could not find container \"121c5ddf04475b188997326eedc59f80a16b5d56cb32de82704fe42cee031f18\": container with ID starting with 121c5ddf04475b188997326eedc59f80a16b5d56cb32de82704fe42cee031f18 not found: ID does not exist" Oct 03 18:27:07 crc kubenswrapper[4835]: I1003 18:27:07.779311 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-8nwfg"] Oct 03 18:27:08 crc kubenswrapper[4835]: I1003 18:27:08.883849 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="300d2397-b9b1-4f44-9eb2-5757940cc64c" path="/var/lib/kubelet/pods/300d2397-b9b1-4f44-9eb2-5757940cc64c/volumes" Oct 03 18:27:08 crc kubenswrapper[4835]: I1003 18:27:08.925681 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-khkwm"] Oct 03 18:27:08 crc kubenswrapper[4835]: E1003 18:27:08.925935 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300d2397-b9b1-4f44-9eb2-5757940cc64c" containerName="console" Oct 03 18:27:08 crc kubenswrapper[4835]: I1003 18:27:08.925945 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="300d2397-b9b1-4f44-9eb2-5757940cc64c" containerName="console" Oct 03 18:27:08 crc kubenswrapper[4835]: I1003 18:27:08.926080 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="300d2397-b9b1-4f44-9eb2-5757940cc64c" containerName="console" Oct 03 18:27:08 crc kubenswrapper[4835]: I1003 18:27:08.927500 4835 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 18:27:08 crc kubenswrapper[4835]: I1003 18:27:08.930610 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khkwm"] Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.051992 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9wmx\" (UniqueName: \"kubernetes.io/projected/af659d83-8386-4373-8de1-c754d9b639af-kube-api-access-n9wmx\") pod \"redhat-operators-khkwm\" (UID: \"af659d83-8386-4373-8de1-c754d9b639af\") " pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.052043 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af659d83-8386-4373-8de1-c754d9b639af-utilities\") pod \"redhat-operators-khkwm\" (UID: \"af659d83-8386-4373-8de1-c754d9b639af\") " pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.052095 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af659d83-8386-4373-8de1-c754d9b639af-catalog-content\") pod \"redhat-operators-khkwm\" (UID: \"af659d83-8386-4373-8de1-c754d9b639af\") " pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.154491 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9wmx\" (UniqueName: \"kubernetes.io/projected/af659d83-8386-4373-8de1-c754d9b639af-kube-api-access-n9wmx\") pod \"redhat-operators-khkwm\" (UID: \"af659d83-8386-4373-8de1-c754d9b639af\") " pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.154540 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af659d83-8386-4373-8de1-c754d9b639af-utilities\") pod \"redhat-operators-khkwm\" (UID: \"af659d83-8386-4373-8de1-c754d9b639af\") " pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.154563 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af659d83-8386-4373-8de1-c754d9b639af-catalog-content\") pod \"redhat-operators-khkwm\" (UID: \"af659d83-8386-4373-8de1-c754d9b639af\") " pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.155109 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af659d83-8386-4373-8de1-c754d9b639af-catalog-content\") pod \"redhat-operators-khkwm\" (UID: \"af659d83-8386-4373-8de1-c754d9b639af\") " pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.155381 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af659d83-8386-4373-8de1-c754d9b639af-utilities\") pod \"redhat-operators-khkwm\" (UID: \"af659d83-8386-4373-8de1-c754d9b639af\") " pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.177175 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-n9wmx\" (UniqueName: \"kubernetes.io/projected/af659d83-8386-4373-8de1-c754d9b639af-kube-api-access-n9wmx\") pod \"redhat-operators-khkwm\" (UID: \"af659d83-8386-4373-8de1-c754d9b639af\") " pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.256315 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.312627 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkxz8"] Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.312862 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rkxz8" podUID="88ac111c-a675-4f41-a9bd-6a7bc90304a7" containerName="registry-server" containerID="cri-o://2af06f8460475ff5668b0dc893bf0f6422d4b9f31fbe3529c335e6191920d953" gracePeriod=2 Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.688657 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-khkwm"] Oct 03 18:27:09 crc kubenswrapper[4835]: W1003 18:27:09.703009 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf659d83_8386_4373_8de1_c754d9b639af.slice/crio-a28bcd44d29e5765e773a3809f27c49a2e0cf514411bbba38583bfd51b646889 WatchSource:0}: Error finding container a28bcd44d29e5765e773a3809f27c49a2e0cf514411bbba38583bfd51b646889: Status 404 returned error can't find the container with id a28bcd44d29e5765e773a3809f27c49a2e0cf514411bbba38583bfd51b646889 Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.737440 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.772331 4835 generic.go:334] "Generic (PLEG): container finished" podID="88ac111c-a675-4f41-a9bd-6a7bc90304a7" containerID="2af06f8460475ff5668b0dc893bf0f6422d4b9f31fbe3529c335e6191920d953" exitCode=0 Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.772385 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkxz8" event={"ID":"88ac111c-a675-4f41-a9bd-6a7bc90304a7","Type":"ContainerDied","Data":"2af06f8460475ff5668b0dc893bf0f6422d4b9f31fbe3529c335e6191920d953"} Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.772413 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rkxz8" event={"ID":"88ac111c-a675-4f41-a9bd-6a7bc90304a7","Type":"ContainerDied","Data":"4ffa656f919d8b684284c28a2050a69399d76febb97e9b98fc51eeef9be06baf"} Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.772430 4835 scope.go:117] "RemoveContainer" containerID="2af06f8460475ff5668b0dc893bf0f6422d4b9f31fbe3529c335e6191920d953" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.772523 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rkxz8" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.774661 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khkwm" event={"ID":"af659d83-8386-4373-8de1-c754d9b639af","Type":"ContainerStarted","Data":"a28bcd44d29e5765e773a3809f27c49a2e0cf514411bbba38583bfd51b646889"} Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.777057 4835 generic.go:334] "Generic (PLEG): container finished" podID="62075311-2baa-4bed-aaf7-df0d5ac3e3f3" containerID="5a7476c380afb52df9a7c594a61bd5176f67c841d20ff9ac2ab1760641391565" exitCode=0 Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.777107 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" event={"ID":"62075311-2baa-4bed-aaf7-df0d5ac3e3f3","Type":"ContainerDied","Data":"5a7476c380afb52df9a7c594a61bd5176f67c841d20ff9ac2ab1760641391565"} Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.805167 4835 scope.go:117] "RemoveContainer" containerID="120fdac88049222c6128164841e8ca3824acec755c36b28946f7931b7729bfc6" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.821586 4835 scope.go:117] "RemoveContainer" containerID="2b62026aa715070c2d21a6e2a33bf7b5ef54e4e0bce2ea4a21726f946e17d0b1" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.857681 4835 scope.go:117] "RemoveContainer" containerID="2af06f8460475ff5668b0dc893bf0f6422d4b9f31fbe3529c335e6191920d953" Oct 03 18:27:09 crc kubenswrapper[4835]: E1003 18:27:09.859514 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2af06f8460475ff5668b0dc893bf0f6422d4b9f31fbe3529c335e6191920d953\": container with ID starting with 2af06f8460475ff5668b0dc893bf0f6422d4b9f31fbe3529c335e6191920d953 not found: ID does not exist" containerID="2af06f8460475ff5668b0dc893bf0f6422d4b9f31fbe3529c335e6191920d953" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.859562 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2af06f8460475ff5668b0dc893bf0f6422d4b9f31fbe3529c335e6191920d953"} err="failed to get container status \"2af06f8460475ff5668b0dc893bf0f6422d4b9f31fbe3529c335e6191920d953\": rpc error: code = NotFound desc = could not find container \"2af06f8460475ff5668b0dc893bf0f6422d4b9f31fbe3529c335e6191920d953\": container with ID starting with 2af06f8460475ff5668b0dc893bf0f6422d4b9f31fbe3529c335e6191920d953 not found: ID does not exist" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.859589 4835 scope.go:117] "RemoveContainer" containerID="120fdac88049222c6128164841e8ca3824acec755c36b28946f7931b7729bfc6" Oct 03 18:27:09 crc kubenswrapper[4835]: E1003 18:27:09.859969 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"120fdac88049222c6128164841e8ca3824acec755c36b28946f7931b7729bfc6\": container with ID starting with 120fdac88049222c6128164841e8ca3824acec755c36b28946f7931b7729bfc6 not found: ID does not exist" containerID="120fdac88049222c6128164841e8ca3824acec755c36b28946f7931b7729bfc6" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.860027 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"120fdac88049222c6128164841e8ca3824acec755c36b28946f7931b7729bfc6"} err="failed to get container status 
\"120fdac88049222c6128164841e8ca3824acec755c36b28946f7931b7729bfc6\": rpc error: code = NotFound desc = could not find container \"120fdac88049222c6128164841e8ca3824acec755c36b28946f7931b7729bfc6\": container with ID starting with 120fdac88049222c6128164841e8ca3824acec755c36b28946f7931b7729bfc6 not found: ID does not exist" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.860091 4835 scope.go:117] "RemoveContainer" containerID="2b62026aa715070c2d21a6e2a33bf7b5ef54e4e0bce2ea4a21726f946e17d0b1" Oct 03 18:27:09 crc kubenswrapper[4835]: E1003 18:27:09.860346 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b62026aa715070c2d21a6e2a33bf7b5ef54e4e0bce2ea4a21726f946e17d0b1\": container with ID starting with 2b62026aa715070c2d21a6e2a33bf7b5ef54e4e0bce2ea4a21726f946e17d0b1 not found: ID does not exist" containerID="2b62026aa715070c2d21a6e2a33bf7b5ef54e4e0bce2ea4a21726f946e17d0b1" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.860369 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b62026aa715070c2d21a6e2a33bf7b5ef54e4e0bce2ea4a21726f946e17d0b1"} err="failed to get container status \"2b62026aa715070c2d21a6e2a33bf7b5ef54e4e0bce2ea4a21726f946e17d0b1\": rpc error: code = NotFound desc = could not find container \"2b62026aa715070c2d21a6e2a33bf7b5ef54e4e0bce2ea4a21726f946e17d0b1\": container with ID starting with 2b62026aa715070c2d21a6e2a33bf7b5ef54e4e0bce2ea4a21726f946e17d0b1 not found: ID does not exist" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.863422 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ac111c-a675-4f41-a9bd-6a7bc90304a7-catalog-content\") pod \"88ac111c-a675-4f41-a9bd-6a7bc90304a7\" (UID: \"88ac111c-a675-4f41-a9bd-6a7bc90304a7\") " Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.863589 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ac111c-a675-4f41-a9bd-6a7bc90304a7-utilities\") pod \"88ac111c-a675-4f41-a9bd-6a7bc90304a7\" (UID: \"88ac111c-a675-4f41-a9bd-6a7bc90304a7\") " Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.863641 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmx2q\" (UniqueName: \"kubernetes.io/projected/88ac111c-a675-4f41-a9bd-6a7bc90304a7-kube-api-access-jmx2q\") pod \"88ac111c-a675-4f41-a9bd-6a7bc90304a7\" (UID: \"88ac111c-a675-4f41-a9bd-6a7bc90304a7\") " Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.865933 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88ac111c-a675-4f41-a9bd-6a7bc90304a7-utilities" (OuterVolumeSpecName: "utilities") pod "88ac111c-a675-4f41-a9bd-6a7bc90304a7" (UID: "88ac111c-a675-4f41-a9bd-6a7bc90304a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.869780 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ac111c-a675-4f41-a9bd-6a7bc90304a7-kube-api-access-jmx2q" (OuterVolumeSpecName: "kube-api-access-jmx2q") pod "88ac111c-a675-4f41-a9bd-6a7bc90304a7" (UID: "88ac111c-a675-4f41-a9bd-6a7bc90304a7"). InnerVolumeSpecName "kube-api-access-jmx2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.884548 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88ac111c-a675-4f41-a9bd-6a7bc90304a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88ac111c-a675-4f41-a9bd-6a7bc90304a7" (UID: "88ac111c-a675-4f41-a9bd-6a7bc90304a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.964735 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ac111c-a675-4f41-a9bd-6a7bc90304a7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.965175 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ac111c-a675-4f41-a9bd-6a7bc90304a7-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:09 crc kubenswrapper[4835]: I1003 18:27:09.965188 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmx2q\" (UniqueName: \"kubernetes.io/projected/88ac111c-a675-4f41-a9bd-6a7bc90304a7-kube-api-access-jmx2q\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:10 crc kubenswrapper[4835]: I1003 18:27:10.109499 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkxz8"] Oct 03 18:27:10 crc kubenswrapper[4835]: I1003 18:27:10.113449 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rkxz8"] Oct 03 18:27:10 crc kubenswrapper[4835]: I1003 18:27:10.784771 4835 generic.go:334] "Generic (PLEG): container finished" podID="62075311-2baa-4bed-aaf7-df0d5ac3e3f3" containerID="d77806c114ee93934f0c120bed980f162d83581a146cb3b5fc67a99ccdacbdb3" exitCode=0 Oct 03 18:27:10 crc kubenswrapper[4835]: I1003 18:27:10.784834 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" event={"ID":"62075311-2baa-4bed-aaf7-df0d5ac3e3f3","Type":"ContainerDied","Data":"d77806c114ee93934f0c120bed980f162d83581a146cb3b5fc67a99ccdacbdb3"} Oct 03 18:27:10 crc kubenswrapper[4835]: I1003 18:27:10.787149 4835 generic.go:334] "Generic (PLEG): container finished" podID="af659d83-8386-4373-8de1-c754d9b639af" containerID="099038c2fb032ad31abe25894f1eb0083c930a2ae9d92800f1d43f5ccb266ce2" exitCode=0 Oct 03 18:27:10 crc kubenswrapper[4835]: I1003 18:27:10.787175 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khkwm" event={"ID":"af659d83-8386-4373-8de1-c754d9b639af","Type":"ContainerDied","Data":"099038c2fb032ad31abe25894f1eb0083c930a2ae9d92800f1d43f5ccb266ce2"} Oct 03 18:27:10 crc kubenswrapper[4835]: I1003 18:27:10.883330 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88ac111c-a675-4f41-a9bd-6a7bc90304a7" path="/var/lib/kubelet/pods/88ac111c-a675-4f41-a9bd-6a7bc90304a7/volumes" Oct 03 18:27:11 crc kubenswrapper[4835]: I1003 18:27:11.793850 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khkwm" event={"ID":"af659d83-8386-4373-8de1-c754d9b639af","Type":"ContainerStarted","Data":"127ea36be0c91bafd225736f85cfc4f17270bafca7afa2089fc61255cfcb0dd5"} Oct 03 18:27:12 crc kubenswrapper[4835]: I1003 18:27:12.114186 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" Oct 03 18:27:12 crc kubenswrapper[4835]: I1003 18:27:12.292152 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-util\") pod \"62075311-2baa-4bed-aaf7-df0d5ac3e3f3\" (UID: \"62075311-2baa-4bed-aaf7-df0d5ac3e3f3\") " Oct 03 18:27:12 crc kubenswrapper[4835]: I1003 18:27:12.292213 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-bundle\") pod \"62075311-2baa-4bed-aaf7-df0d5ac3e3f3\" (UID: \"62075311-2baa-4bed-aaf7-df0d5ac3e3f3\") " Oct 03 18:27:12 crc kubenswrapper[4835]: I1003 18:27:12.292287 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjf72\" (UniqueName: \"kubernetes.io/projected/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-kube-api-access-bjf72\") pod \"62075311-2baa-4bed-aaf7-df0d5ac3e3f3\" (UID: \"62075311-2baa-4bed-aaf7-df0d5ac3e3f3\") " Oct 03 18:27:12 crc kubenswrapper[4835]: I1003 18:27:12.293368 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-bundle" (OuterVolumeSpecName: "bundle") pod "62075311-2baa-4bed-aaf7-df0d5ac3e3f3" (UID: "62075311-2baa-4bed-aaf7-df0d5ac3e3f3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:27:12 crc kubenswrapper[4835]: I1003 18:27:12.298343 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-kube-api-access-bjf72" (OuterVolumeSpecName: "kube-api-access-bjf72") pod "62075311-2baa-4bed-aaf7-df0d5ac3e3f3" (UID: "62075311-2baa-4bed-aaf7-df0d5ac3e3f3"). InnerVolumeSpecName "kube-api-access-bjf72". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:27:12 crc kubenswrapper[4835]: I1003 18:27:12.302957 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-util" (OuterVolumeSpecName: "util") pod "62075311-2baa-4bed-aaf7-df0d5ac3e3f3" (UID: "62075311-2baa-4bed-aaf7-df0d5ac3e3f3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:27:12 crc kubenswrapper[4835]: I1003 18:27:12.393743 4835 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-util\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:12 crc kubenswrapper[4835]: I1003 18:27:12.393775 4835 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:12 crc kubenswrapper[4835]: I1003 18:27:12.393784 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjf72\" (UniqueName: \"kubernetes.io/projected/62075311-2baa-4bed-aaf7-df0d5ac3e3f3-kube-api-access-bjf72\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:12 crc kubenswrapper[4835]: I1003 18:27:12.802236 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" event={"ID":"62075311-2baa-4bed-aaf7-df0d5ac3e3f3","Type":"ContainerDied","Data":"e1373b72bb7c0a7ec4f342f25d1820a69fe472b560611f06910d0106c6a044be"} Oct 03 18:27:12 crc kubenswrapper[4835]: I1003 18:27:12.802287 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1373b72bb7c0a7ec4f342f25d1820a69fe472b560611f06910d0106c6a044be" Oct 03 18:27:12 crc kubenswrapper[4835]: I1003 18:27:12.802259 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p" Oct 03 18:27:12 crc kubenswrapper[4835]: I1003 18:27:12.803963 4835 generic.go:334] "Generic (PLEG): container finished" podID="af659d83-8386-4373-8de1-c754d9b639af" containerID="127ea36be0c91bafd225736f85cfc4f17270bafca7afa2089fc61255cfcb0dd5" exitCode=0 Oct 03 18:27:12 crc kubenswrapper[4835]: I1003 18:27:12.803995 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khkwm" event={"ID":"af659d83-8386-4373-8de1-c754d9b639af","Type":"ContainerDied","Data":"127ea36be0c91bafd225736f85cfc4f17270bafca7afa2089fc61255cfcb0dd5"} Oct 03 18:27:13 crc kubenswrapper[4835]: I1003 18:27:13.811360 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khkwm" event={"ID":"af659d83-8386-4373-8de1-c754d9b639af","Type":"ContainerStarted","Data":"396d74ecc4babb88a267df6645bad0f460ff91ad9e4c0abb2a0200534e161fce"} Oct 03 18:27:13 crc kubenswrapper[4835]: I1003 18:27:13.827219 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-khkwm" podStartSLOduration=3.327762505 podStartE2EDuration="5.827202306s" podCreationTimestamp="2025-10-03 18:27:08 +0000 UTC" firstStartedPulling="2025-10-03 18:27:10.788780019 +0000 UTC m=+772.504720891" lastFinishedPulling="2025-10-03 18:27:13.28821982 +0000 UTC m=+775.004160692" observedRunningTime="2025-10-03 18:27:13.826802435 +0000 UTC m=+775.542743307" watchObservedRunningTime="2025-10-03 18:27:13.827202306 +0000 UTC m=+775.543143178" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.257308 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.257710 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 
18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.298048 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.518157 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gttr9"] Oct 03 18:27:19 crc kubenswrapper[4835]: E1003 18:27:19.518636 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62075311-2baa-4bed-aaf7-df0d5ac3e3f3" containerName="pull" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.518714 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="62075311-2baa-4bed-aaf7-df0d5ac3e3f3" containerName="pull" Oct 03 18:27:19 crc kubenswrapper[4835]: E1003 18:27:19.518785 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ac111c-a675-4f41-a9bd-6a7bc90304a7" containerName="registry-server" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.518840 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ac111c-a675-4f41-a9bd-6a7bc90304a7" containerName="registry-server" Oct 03 18:27:19 crc kubenswrapper[4835]: E1003 18:27:19.518897 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ac111c-a675-4f41-a9bd-6a7bc90304a7" containerName="extract-content" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.518978 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ac111c-a675-4f41-a9bd-6a7bc90304a7" containerName="extract-content" Oct 03 18:27:19 crc kubenswrapper[4835]: E1003 18:27:19.519469 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62075311-2baa-4bed-aaf7-df0d5ac3e3f3" containerName="util" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.519531 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="62075311-2baa-4bed-aaf7-df0d5ac3e3f3" containerName="util" Oct 03 18:27:19 crc kubenswrapper[4835]: E1003 18:27:19.519611 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62075311-2baa-4bed-aaf7-df0d5ac3e3f3" containerName="extract" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.519674 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="62075311-2baa-4bed-aaf7-df0d5ac3e3f3" containerName="extract" Oct 03 18:27:19 crc kubenswrapper[4835]: E1003 18:27:19.519733 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ac111c-a675-4f41-a9bd-6a7bc90304a7" containerName="extract-utilities" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.519787 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ac111c-a675-4f41-a9bd-6a7bc90304a7" containerName="extract-utilities" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.519939 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ac111c-a675-4f41-a9bd-6a7bc90304a7" containerName="registry-server" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.520004 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="62075311-2baa-4bed-aaf7-df0d5ac3e3f3" containerName="extract" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.520861 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.527240 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gttr9"] Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.680606 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-utilities\") pod \"certified-operators-gttr9\" (UID: \"7966d381-a9c1-42b8-8cb5-39745dd5cbe9\") " pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.680658 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27qw4\" (UniqueName: \"kubernetes.io/projected/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-kube-api-access-27qw4\") pod \"certified-operators-gttr9\" (UID: \"7966d381-a9c1-42b8-8cb5-39745dd5cbe9\") " pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.680791 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-catalog-content\") pod \"certified-operators-gttr9\" (UID: \"7966d381-a9c1-42b8-8cb5-39745dd5cbe9\") " pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.782136 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-catalog-content\") pod \"certified-operators-gttr9\" (UID: \"7966d381-a9c1-42b8-8cb5-39745dd5cbe9\") " pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.782220 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-utilities\") pod \"certified-operators-gttr9\" (UID: \"7966d381-a9c1-42b8-8cb5-39745dd5cbe9\") " pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.782261 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27qw4\" (UniqueName: \"kubernetes.io/projected/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-kube-api-access-27qw4\") pod \"certified-operators-gttr9\" (UID: \"7966d381-a9c1-42b8-8cb5-39745dd5cbe9\") " pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.783134 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-catalog-content\") pod \"certified-operators-gttr9\" (UID: \"7966d381-a9c1-42b8-8cb5-39745dd5cbe9\") " pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.783148 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-utilities\") pod \"certified-operators-gttr9\" (UID: \"7966d381-a9c1-42b8-8cb5-39745dd5cbe9\") " pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.805244 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-27qw4\" (UniqueName: \"kubernetes.io/projected/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-kube-api-access-27qw4\") pod \"certified-operators-gttr9\" (UID: \"7966d381-a9c1-42b8-8cb5-39745dd5cbe9\") " pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.834945 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:19 crc kubenswrapper[4835]: I1003 18:27:19.890915 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 18:27:20 crc kubenswrapper[4835]: I1003 18:27:20.273127 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gttr9"] Oct 03 18:27:20 crc kubenswrapper[4835]: W1003 18:27:20.289830 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7966d381_a9c1_42b8_8cb5_39745dd5cbe9.slice/crio-da555bf74af27249dc55c57f517a64e95585864fc4b0e0bd429d07a7261bc524 WatchSource:0}: Error finding container da555bf74af27249dc55c57f517a64e95585864fc4b0e0bd429d07a7261bc524: Status 404 returned error can't find the container with id da555bf74af27249dc55c57f517a64e95585864fc4b0e0bd429d07a7261bc524 Oct 03 18:27:20 crc kubenswrapper[4835]: I1003 18:27:20.848755 4835 generic.go:334] "Generic (PLEG): container finished" podID="7966d381-a9c1-42b8-8cb5-39745dd5cbe9" containerID="c49d268a4d2959871474f1849df09fe50bd1fcef2d2c064574b7d91d8bf94b3c" exitCode=0 Oct 03 18:27:20 crc kubenswrapper[4835]: I1003 18:27:20.848881 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gttr9" event={"ID":"7966d381-a9c1-42b8-8cb5-39745dd5cbe9","Type":"ContainerDied","Data":"c49d268a4d2959871474f1849df09fe50bd1fcef2d2c064574b7d91d8bf94b3c"} Oct 03 18:27:20 crc kubenswrapper[4835]: I1003 18:27:20.848933 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gttr9" event={"ID":"7966d381-a9c1-42b8-8cb5-39745dd5cbe9","Type":"ContainerStarted","Data":"da555bf74af27249dc55c57f517a64e95585864fc4b0e0bd429d07a7261bc524"} Oct 03 18:27:21 crc kubenswrapper[4835]: I1003 18:27:21.788482 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s"] Oct 03 18:27:21 crc kubenswrapper[4835]: I1003 18:27:21.789389 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s" Oct 03 18:27:21 crc kubenswrapper[4835]: I1003 18:27:21.794262 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 03 18:27:21 crc kubenswrapper[4835]: I1003 18:27:21.794722 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 03 18:27:21 crc kubenswrapper[4835]: I1003 18:27:21.794800 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 03 18:27:21 crc kubenswrapper[4835]: I1003 18:27:21.794838 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-5m49f" Oct 03 18:27:21 crc kubenswrapper[4835]: I1003 18:27:21.794952 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 03 18:27:21 crc kubenswrapper[4835]: I1003 18:27:21.847289 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s"] Oct 03 18:27:21 crc kubenswrapper[4835]: I1003 18:27:21.856359 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gttr9" event={"ID":"7966d381-a9c1-42b8-8cb5-39745dd5cbe9","Type":"ContainerStarted","Data":"bca5b1abead71b5f87e2fd0a5c7a996b60772c6082acfc48398d4c70fa9915d8"} Oct 03 18:27:21 crc kubenswrapper[4835]: I1003 18:27:21.912799 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2bmn\" (UniqueName: \"kubernetes.io/projected/87d605ee-e88a-4b71-9033-029e4ceaf6e5-kube-api-access-f2bmn\") pod \"metallb-operator-controller-manager-549cbc687f-7s58s\" (UID: \"87d605ee-e88a-4b71-9033-029e4ceaf6e5\") " pod="metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s" Oct 03 18:27:21 crc kubenswrapper[4835]: I1003 18:27:21.912963 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/87d605ee-e88a-4b71-9033-029e4ceaf6e5-webhook-cert\") pod \"metallb-operator-controller-manager-549cbc687f-7s58s\" (UID: \"87d605ee-e88a-4b71-9033-029e4ceaf6e5\") " pod="metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s" Oct 03 18:27:21 crc kubenswrapper[4835]: I1003 18:27:21.913081 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/87d605ee-e88a-4b71-9033-029e4ceaf6e5-apiservice-cert\") pod \"metallb-operator-controller-manager-549cbc687f-7s58s\" (UID: \"87d605ee-e88a-4b71-9033-029e4ceaf6e5\") " pod="metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.014910 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2bmn\" (UniqueName: \"kubernetes.io/projected/87d605ee-e88a-4b71-9033-029e4ceaf6e5-kube-api-access-f2bmn\") pod \"metallb-operator-controller-manager-549cbc687f-7s58s\" (UID: \"87d605ee-e88a-4b71-9033-029e4ceaf6e5\") " pod="metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.014977 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/87d605ee-e88a-4b71-9033-029e4ceaf6e5-webhook-cert\") pod \"metallb-operator-controller-manager-549cbc687f-7s58s\" (UID: \"87d605ee-e88a-4b71-9033-029e4ceaf6e5\") " pod="metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.015026 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/87d605ee-e88a-4b71-9033-029e4ceaf6e5-apiservice-cert\") pod \"metallb-operator-controller-manager-549cbc687f-7s58s\" (UID: \"87d605ee-e88a-4b71-9033-029e4ceaf6e5\") " pod="metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.022506 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx"] Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.026169 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.029937 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/87d605ee-e88a-4b71-9033-029e4ceaf6e5-webhook-cert\") pod \"metallb-operator-controller-manager-549cbc687f-7s58s\" (UID: \"87d605ee-e88a-4b71-9033-029e4ceaf6e5\") " pod="metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.030555 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.030892 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.031106 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-44dpm" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.036372 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/87d605ee-e88a-4b71-9033-029e4ceaf6e5-apiservice-cert\") pod \"metallb-operator-controller-manager-549cbc687f-7s58s\" (UID: \"87d605ee-e88a-4b71-9033-029e4ceaf6e5\") " pod="metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.038341 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx"] Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.046019 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2bmn\" (UniqueName: \"kubernetes.io/projected/87d605ee-e88a-4b71-9033-029e4ceaf6e5-kube-api-access-f2bmn\") pod \"metallb-operator-controller-manager-549cbc687f-7s58s\" (UID: \"87d605ee-e88a-4b71-9033-029e4ceaf6e5\") " pod="metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.103109 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.217833 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvnwj\" (UniqueName: \"kubernetes.io/projected/a33193c6-6de6-466f-bc95-046e6d7ae204-kube-api-access-dvnwj\") pod \"metallb-operator-webhook-server-59cd86bdc9-s28jx\" (UID: \"a33193c6-6de6-466f-bc95-046e6d7ae204\") " pod="metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.217916 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a33193c6-6de6-466f-bc95-046e6d7ae204-apiservice-cert\") pod \"metallb-operator-webhook-server-59cd86bdc9-s28jx\" (UID: \"a33193c6-6de6-466f-bc95-046e6d7ae204\") " pod="metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.218349 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a33193c6-6de6-466f-bc95-046e6d7ae204-webhook-cert\") pod \"metallb-operator-webhook-server-59cd86bdc9-s28jx\" (UID: \"a33193c6-6de6-466f-bc95-046e6d7ae204\") " pod="metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.320111 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a33193c6-6de6-466f-bc95-046e6d7ae204-webhook-cert\") pod \"metallb-operator-webhook-server-59cd86bdc9-s28jx\" (UID: \"a33193c6-6de6-466f-bc95-046e6d7ae204\") " pod="metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.320222 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvnwj\" (UniqueName: \"kubernetes.io/projected/a33193c6-6de6-466f-bc95-046e6d7ae204-kube-api-access-dvnwj\") pod \"metallb-operator-webhook-server-59cd86bdc9-s28jx\" (UID: \"a33193c6-6de6-466f-bc95-046e6d7ae204\") " pod="metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.320268 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a33193c6-6de6-466f-bc95-046e6d7ae204-apiservice-cert\") pod \"metallb-operator-webhook-server-59cd86bdc9-s28jx\" (UID: \"a33193c6-6de6-466f-bc95-046e6d7ae204\") " pod="metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.324490 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a33193c6-6de6-466f-bc95-046e6d7ae204-webhook-cert\") pod \"metallb-operator-webhook-server-59cd86bdc9-s28jx\" (UID: \"a33193c6-6de6-466f-bc95-046e6d7ae204\") " pod="metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.327909 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a33193c6-6de6-466f-bc95-046e6d7ae204-apiservice-cert\") pod \"metallb-operator-webhook-server-59cd86bdc9-s28jx\" (UID: \"a33193c6-6de6-466f-bc95-046e6d7ae204\") " 
pod="metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.337000 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvnwj\" (UniqueName: \"kubernetes.io/projected/a33193c6-6de6-466f-bc95-046e6d7ae204-kube-api-access-dvnwj\") pod \"metallb-operator-webhook-server-59cd86bdc9-s28jx\" (UID: \"a33193c6-6de6-466f-bc95-046e6d7ae204\") " pod="metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.399863 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx" Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.547743 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s"] Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.617305 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx"] Oct 03 18:27:22 crc kubenswrapper[4835]: W1003 18:27:22.632366 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda33193c6_6de6_466f_bc95_046e6d7ae204.slice/crio-2c5e874421ed8ad55a1fbb5b9fb96dc0b483a1fcf3fdfd1503f7b3652ae8cb0f WatchSource:0}: Error finding container 2c5e874421ed8ad55a1fbb5b9fb96dc0b483a1fcf3fdfd1503f7b3652ae8cb0f: Status 404 returned error can't find the container with id 2c5e874421ed8ad55a1fbb5b9fb96dc0b483a1fcf3fdfd1503f7b3652ae8cb0f Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.863297 4835 generic.go:334] "Generic (PLEG): container finished" podID="7966d381-a9c1-42b8-8cb5-39745dd5cbe9" containerID="bca5b1abead71b5f87e2fd0a5c7a996b60772c6082acfc48398d4c70fa9915d8" exitCode=0 Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.863395 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gttr9" event={"ID":"7966d381-a9c1-42b8-8cb5-39745dd5cbe9","Type":"ContainerDied","Data":"bca5b1abead71b5f87e2fd0a5c7a996b60772c6082acfc48398d4c70fa9915d8"} Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.864875 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s" event={"ID":"87d605ee-e88a-4b71-9033-029e4ceaf6e5","Type":"ContainerStarted","Data":"1f8547bae9fab352bdd491b69b940d1174780346db1aefd1fa73054c564e264d"} Oct 03 18:27:22 crc kubenswrapper[4835]: I1003 18:27:22.866753 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx" event={"ID":"a33193c6-6de6-466f-bc95-046e6d7ae204","Type":"ContainerStarted","Data":"2c5e874421ed8ad55a1fbb5b9fb96dc0b483a1fcf3fdfd1503f7b3652ae8cb0f"} Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.111869 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-khkwm"] Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.112172 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-khkwm" podUID="af659d83-8386-4373-8de1-c754d9b639af" containerName="registry-server" containerID="cri-o://396d74ecc4babb88a267df6645bad0f460ff91ad9e4c0abb2a0200534e161fce" gracePeriod=2 Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.496998 4835 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.635115 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9wmx\" (UniqueName: \"kubernetes.io/projected/af659d83-8386-4373-8de1-c754d9b639af-kube-api-access-n9wmx\") pod \"af659d83-8386-4373-8de1-c754d9b639af\" (UID: \"af659d83-8386-4373-8de1-c754d9b639af\") " Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.635833 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af659d83-8386-4373-8de1-c754d9b639af-catalog-content\") pod \"af659d83-8386-4373-8de1-c754d9b639af\" (UID: \"af659d83-8386-4373-8de1-c754d9b639af\") " Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.635888 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af659d83-8386-4373-8de1-c754d9b639af-utilities\") pod \"af659d83-8386-4373-8de1-c754d9b639af\" (UID: \"af659d83-8386-4373-8de1-c754d9b639af\") " Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.636881 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af659d83-8386-4373-8de1-c754d9b639af-utilities" (OuterVolumeSpecName: "utilities") pod "af659d83-8386-4373-8de1-c754d9b639af" (UID: "af659d83-8386-4373-8de1-c754d9b639af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.640954 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af659d83-8386-4373-8de1-c754d9b639af-kube-api-access-n9wmx" (OuterVolumeSpecName: "kube-api-access-n9wmx") pod "af659d83-8386-4373-8de1-c754d9b639af" (UID: "af659d83-8386-4373-8de1-c754d9b639af"). InnerVolumeSpecName "kube-api-access-n9wmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.711398 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af659d83-8386-4373-8de1-c754d9b639af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af659d83-8386-4373-8de1-c754d9b639af" (UID: "af659d83-8386-4373-8de1-c754d9b639af"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.737342 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9wmx\" (UniqueName: \"kubernetes.io/projected/af659d83-8386-4373-8de1-c754d9b639af-kube-api-access-n9wmx\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.737374 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af659d83-8386-4373-8de1-c754d9b639af-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.737383 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af659d83-8386-4373-8de1-c754d9b639af-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.873795 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gttr9" event={"ID":"7966d381-a9c1-42b8-8cb5-39745dd5cbe9","Type":"ContainerStarted","Data":"220f2c3e608afc356e1bbad7691ffe1527226154f66f9b83c6595b1fb8d2ccba"} Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.876117 4835 generic.go:334] "Generic (PLEG): container finished" podID="af659d83-8386-4373-8de1-c754d9b639af" containerID="396d74ecc4babb88a267df6645bad0f460ff91ad9e4c0abb2a0200534e161fce" exitCode=0 Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.876152 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khkwm" event={"ID":"af659d83-8386-4373-8de1-c754d9b639af","Type":"ContainerDied","Data":"396d74ecc4babb88a267df6645bad0f460ff91ad9e4c0abb2a0200534e161fce"} Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.876189 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-khkwm" event={"ID":"af659d83-8386-4373-8de1-c754d9b639af","Type":"ContainerDied","Data":"a28bcd44d29e5765e773a3809f27c49a2e0cf514411bbba38583bfd51b646889"} Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.876194 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-khkwm" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.876206 4835 scope.go:117] "RemoveContainer" containerID="396d74ecc4babb88a267df6645bad0f460ff91ad9e4c0abb2a0200534e161fce" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.889988 4835 scope.go:117] "RemoveContainer" containerID="127ea36be0c91bafd225736f85cfc4f17270bafca7afa2089fc61255cfcb0dd5" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.894693 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gttr9" podStartSLOduration=2.40481576 podStartE2EDuration="4.894675242s" podCreationTimestamp="2025-10-03 18:27:19 +0000 UTC" firstStartedPulling="2025-10-03 18:27:20.850359838 +0000 UTC m=+782.566300710" lastFinishedPulling="2025-10-03 18:27:23.34021932 +0000 UTC m=+785.056160192" observedRunningTime="2025-10-03 18:27:23.893512853 +0000 UTC m=+785.609453725" watchObservedRunningTime="2025-10-03 18:27:23.894675242 +0000 UTC m=+785.610616114" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.909853 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-khkwm"] Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.913303 4835 scope.go:117] "RemoveContainer" containerID="099038c2fb032ad31abe25894f1eb0083c930a2ae9d92800f1d43f5ccb266ce2" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.917171 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-khkwm"] Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.930765 4835 scope.go:117] "RemoveContainer" containerID="396d74ecc4babb88a267df6645bad0f460ff91ad9e4c0abb2a0200534e161fce" Oct 03 18:27:23 crc kubenswrapper[4835]: E1003 18:27:23.931609 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"396d74ecc4babb88a267df6645bad0f460ff91ad9e4c0abb2a0200534e161fce\": container with ID starting with 396d74ecc4babb88a267df6645bad0f460ff91ad9e4c0abb2a0200534e161fce not found: ID does not exist" containerID="396d74ecc4babb88a267df6645bad0f460ff91ad9e4c0abb2a0200534e161fce" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.931649 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"396d74ecc4babb88a267df6645bad0f460ff91ad9e4c0abb2a0200534e161fce"} err="failed to get container status \"396d74ecc4babb88a267df6645bad0f460ff91ad9e4c0abb2a0200534e161fce\": rpc error: code = NotFound desc = could not find container \"396d74ecc4babb88a267df6645bad0f460ff91ad9e4c0abb2a0200534e161fce\": container with ID starting with 396d74ecc4babb88a267df6645bad0f460ff91ad9e4c0abb2a0200534e161fce not found: ID does not exist" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.931678 4835 scope.go:117] "RemoveContainer" containerID="127ea36be0c91bafd225736f85cfc4f17270bafca7afa2089fc61255cfcb0dd5" Oct 03 18:27:23 crc kubenswrapper[4835]: E1003 18:27:23.932116 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"127ea36be0c91bafd225736f85cfc4f17270bafca7afa2089fc61255cfcb0dd5\": container with ID starting with 127ea36be0c91bafd225736f85cfc4f17270bafca7afa2089fc61255cfcb0dd5 not found: ID does not exist" containerID="127ea36be0c91bafd225736f85cfc4f17270bafca7afa2089fc61255cfcb0dd5" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.932139 4835 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"127ea36be0c91bafd225736f85cfc4f17270bafca7afa2089fc61255cfcb0dd5"} err="failed to get container status \"127ea36be0c91bafd225736f85cfc4f17270bafca7afa2089fc61255cfcb0dd5\": rpc error: code = NotFound desc = could not find container \"127ea36be0c91bafd225736f85cfc4f17270bafca7afa2089fc61255cfcb0dd5\": container with ID starting with 127ea36be0c91bafd225736f85cfc4f17270bafca7afa2089fc61255cfcb0dd5 not found: ID does not exist" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.932152 4835 scope.go:117] "RemoveContainer" containerID="099038c2fb032ad31abe25894f1eb0083c930a2ae9d92800f1d43f5ccb266ce2" Oct 03 18:27:23 crc kubenswrapper[4835]: E1003 18:27:23.932555 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"099038c2fb032ad31abe25894f1eb0083c930a2ae9d92800f1d43f5ccb266ce2\": container with ID starting with 099038c2fb032ad31abe25894f1eb0083c930a2ae9d92800f1d43f5ccb266ce2 not found: ID does not exist" containerID="099038c2fb032ad31abe25894f1eb0083c930a2ae9d92800f1d43f5ccb266ce2" Oct 03 18:27:23 crc kubenswrapper[4835]: I1003 18:27:23.932575 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099038c2fb032ad31abe25894f1eb0083c930a2ae9d92800f1d43f5ccb266ce2"} err="failed to get container status \"099038c2fb032ad31abe25894f1eb0083c930a2ae9d92800f1d43f5ccb266ce2\": rpc error: code = NotFound desc = could not find container \"099038c2fb032ad31abe25894f1eb0083c930a2ae9d92800f1d43f5ccb266ce2\": container with ID starting with 099038c2fb032ad31abe25894f1eb0083c930a2ae9d92800f1d43f5ccb266ce2 not found: ID does not exist" Oct 03 18:27:24 crc kubenswrapper[4835]: I1003 18:27:24.892445 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af659d83-8386-4373-8de1-c754d9b639af" path="/var/lib/kubelet/pods/af659d83-8386-4373-8de1-c754d9b639af/volumes" Oct 03 18:27:27 crc kubenswrapper[4835]: I1003 18:27:27.918458 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx" event={"ID":"a33193c6-6de6-466f-bc95-046e6d7ae204","Type":"ContainerStarted","Data":"d633a3b5bd3e4fefed91321fb289d6553850c6a97fc33d683817b4271159fe4b"} Oct 03 18:27:27 crc kubenswrapper[4835]: I1003 18:27:27.919626 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx" Oct 03 18:27:27 crc kubenswrapper[4835]: I1003 18:27:27.934609 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx" podStartSLOduration=1.023901538 podStartE2EDuration="5.934592492s" podCreationTimestamp="2025-10-03 18:27:22 +0000 UTC" firstStartedPulling="2025-10-03 18:27:22.634978357 +0000 UTC m=+784.350919229" lastFinishedPulling="2025-10-03 18:27:27.545669291 +0000 UTC m=+789.261610183" observedRunningTime="2025-10-03 18:27:27.933163526 +0000 UTC m=+789.649104398" watchObservedRunningTime="2025-10-03 18:27:27.934592492 +0000 UTC m=+789.650533364" Oct 03 18:27:29 crc kubenswrapper[4835]: I1003 18:27:29.835613 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:29 crc kubenswrapper[4835]: I1003 18:27:29.835928 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:29 crc kubenswrapper[4835]: I1003 18:27:29.888145 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:29 crc kubenswrapper[4835]: I1003 18:27:29.931448 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s" event={"ID":"87d605ee-e88a-4b71-9033-029e4ceaf6e5","Type":"ContainerStarted","Data":"959fa9b97281714f44180b692d8b26bd9a10effaa3d5af9a760969373ffe554d"} Oct 03 18:27:29 crc kubenswrapper[4835]: I1003 18:27:29.956964 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s" podStartSLOduration=2.5859800120000003 podStartE2EDuration="8.956947903s" podCreationTimestamp="2025-10-03 18:27:21 +0000 UTC" firstStartedPulling="2025-10-03 18:27:22.567464253 +0000 UTC m=+784.283405125" lastFinishedPulling="2025-10-03 18:27:28.938432144 +0000 UTC m=+790.654373016" observedRunningTime="2025-10-03 18:27:29.95364806 +0000 UTC m=+791.669588932" watchObservedRunningTime="2025-10-03 18:27:29.956947903 +0000 UTC m=+791.672888775" Oct 03 18:27:29 crc kubenswrapper[4835]: I1003 18:27:29.970837 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:30 crc kubenswrapper[4835]: I1003 18:27:30.936044 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s" Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.313055 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gttr9"] Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.314257 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gttr9" podUID="7966d381-a9c1-42b8-8cb5-39745dd5cbe9" containerName="registry-server" containerID="cri-o://220f2c3e608afc356e1bbad7691ffe1527226154f66f9b83c6595b1fb8d2ccba" gracePeriod=2 Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.757082 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.873355 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-utilities\") pod \"7966d381-a9c1-42b8-8cb5-39745dd5cbe9\" (UID: \"7966d381-a9c1-42b8-8cb5-39745dd5cbe9\") " Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.873404 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-catalog-content\") pod \"7966d381-a9c1-42b8-8cb5-39745dd5cbe9\" (UID: \"7966d381-a9c1-42b8-8cb5-39745dd5cbe9\") " Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.873499 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27qw4\" (UniqueName: \"kubernetes.io/projected/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-kube-api-access-27qw4\") pod \"7966d381-a9c1-42b8-8cb5-39745dd5cbe9\" (UID: \"7966d381-a9c1-42b8-8cb5-39745dd5cbe9\") " Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.875489 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-utilities" (OuterVolumeSpecName: "utilities") pod "7966d381-a9c1-42b8-8cb5-39745dd5cbe9" (UID: "7966d381-a9c1-42b8-8cb5-39745dd5cbe9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.889627 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-kube-api-access-27qw4" (OuterVolumeSpecName: "kube-api-access-27qw4") pod "7966d381-a9c1-42b8-8cb5-39745dd5cbe9" (UID: "7966d381-a9c1-42b8-8cb5-39745dd5cbe9"). InnerVolumeSpecName "kube-api-access-27qw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.947564 4835 generic.go:334] "Generic (PLEG): container finished" podID="7966d381-a9c1-42b8-8cb5-39745dd5cbe9" containerID="220f2c3e608afc356e1bbad7691ffe1527226154f66f9b83c6595b1fb8d2ccba" exitCode=0 Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.947603 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gttr9" event={"ID":"7966d381-a9c1-42b8-8cb5-39745dd5cbe9","Type":"ContainerDied","Data":"220f2c3e608afc356e1bbad7691ffe1527226154f66f9b83c6595b1fb8d2ccba"} Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.947626 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gttr9" event={"ID":"7966d381-a9c1-42b8-8cb5-39745dd5cbe9","Type":"ContainerDied","Data":"da555bf74af27249dc55c57f517a64e95585864fc4b0e0bd429d07a7261bc524"} Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.947643 4835 scope.go:117] "RemoveContainer" containerID="220f2c3e608afc356e1bbad7691ffe1527226154f66f9b83c6595b1fb8d2ccba" Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.947742 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gttr9" Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.965225 4835 scope.go:117] "RemoveContainer" containerID="bca5b1abead71b5f87e2fd0a5c7a996b60772c6082acfc48398d4c70fa9915d8" Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.976822 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.976850 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27qw4\" (UniqueName: \"kubernetes.io/projected/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-kube-api-access-27qw4\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.979495 4835 scope.go:117] "RemoveContainer" containerID="c49d268a4d2959871474f1849df09fe50bd1fcef2d2c064574b7d91d8bf94b3c" Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.997238 4835 scope.go:117] "RemoveContainer" containerID="220f2c3e608afc356e1bbad7691ffe1527226154f66f9b83c6595b1fb8d2ccba" Oct 03 18:27:32 crc kubenswrapper[4835]: E1003 18:27:32.997700 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"220f2c3e608afc356e1bbad7691ffe1527226154f66f9b83c6595b1fb8d2ccba\": container with ID starting with 220f2c3e608afc356e1bbad7691ffe1527226154f66f9b83c6595b1fb8d2ccba not found: ID does not exist" containerID="220f2c3e608afc356e1bbad7691ffe1527226154f66f9b83c6595b1fb8d2ccba" Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.997742 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220f2c3e608afc356e1bbad7691ffe1527226154f66f9b83c6595b1fb8d2ccba"} err="failed to get container status \"220f2c3e608afc356e1bbad7691ffe1527226154f66f9b83c6595b1fb8d2ccba\": rpc error: code = NotFound desc = could not find container \"220f2c3e608afc356e1bbad7691ffe1527226154f66f9b83c6595b1fb8d2ccba\": container with ID starting with 220f2c3e608afc356e1bbad7691ffe1527226154f66f9b83c6595b1fb8d2ccba not found: ID does not exist" Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.997768 4835 scope.go:117] "RemoveContainer" containerID="bca5b1abead71b5f87e2fd0a5c7a996b60772c6082acfc48398d4c70fa9915d8" Oct 03 18:27:32 crc kubenswrapper[4835]: E1003 18:27:32.998177 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bca5b1abead71b5f87e2fd0a5c7a996b60772c6082acfc48398d4c70fa9915d8\": container with ID starting with bca5b1abead71b5f87e2fd0a5c7a996b60772c6082acfc48398d4c70fa9915d8 not found: ID does not exist" containerID="bca5b1abead71b5f87e2fd0a5c7a996b60772c6082acfc48398d4c70fa9915d8" Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.998206 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bca5b1abead71b5f87e2fd0a5c7a996b60772c6082acfc48398d4c70fa9915d8"} err="failed to get container status \"bca5b1abead71b5f87e2fd0a5c7a996b60772c6082acfc48398d4c70fa9915d8\": rpc error: code = NotFound desc = could not find container \"bca5b1abead71b5f87e2fd0a5c7a996b60772c6082acfc48398d4c70fa9915d8\": container with ID starting with bca5b1abead71b5f87e2fd0a5c7a996b60772c6082acfc48398d4c70fa9915d8 not found: ID does not exist" Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.998227 4835 scope.go:117] "RemoveContainer" 
containerID="c49d268a4d2959871474f1849df09fe50bd1fcef2d2c064574b7d91d8bf94b3c" Oct 03 18:27:32 crc kubenswrapper[4835]: E1003 18:27:32.998453 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c49d268a4d2959871474f1849df09fe50bd1fcef2d2c064574b7d91d8bf94b3c\": container with ID starting with c49d268a4d2959871474f1849df09fe50bd1fcef2d2c064574b7d91d8bf94b3c not found: ID does not exist" containerID="c49d268a4d2959871474f1849df09fe50bd1fcef2d2c064574b7d91d8bf94b3c" Oct 03 18:27:32 crc kubenswrapper[4835]: I1003 18:27:32.998472 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c49d268a4d2959871474f1849df09fe50bd1fcef2d2c064574b7d91d8bf94b3c"} err="failed to get container status \"c49d268a4d2959871474f1849df09fe50bd1fcef2d2c064574b7d91d8bf94b3c\": rpc error: code = NotFound desc = could not find container \"c49d268a4d2959871474f1849df09fe50bd1fcef2d2c064574b7d91d8bf94b3c\": container with ID starting with c49d268a4d2959871474f1849df09fe50bd1fcef2d2c064574b7d91d8bf94b3c not found: ID does not exist" Oct 03 18:27:33 crc kubenswrapper[4835]: I1003 18:27:33.356786 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7966d381-a9c1-42b8-8cb5-39745dd5cbe9" (UID: "7966d381-a9c1-42b8-8cb5-39745dd5cbe9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:27:33 crc kubenswrapper[4835]: I1003 18:27:33.381605 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7966d381-a9c1-42b8-8cb5-39745dd5cbe9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:27:33 crc kubenswrapper[4835]: I1003 18:27:33.572717 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gttr9"] Oct 03 18:27:33 crc kubenswrapper[4835]: I1003 18:27:33.577275 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gttr9"] Oct 03 18:27:34 crc kubenswrapper[4835]: I1003 18:27:34.884883 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7966d381-a9c1-42b8-8cb5-39745dd5cbe9" path="/var/lib/kubelet/pods/7966d381-a9c1-42b8-8cb5-39745dd5cbe9/volumes" Oct 03 18:27:42 crc kubenswrapper[4835]: I1003 18:27:42.404582 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-59cd86bdc9-s28jx" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.105669 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-549cbc687f-7s58s" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.121444 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hs68b"] Oct 03 18:28:02 crc kubenswrapper[4835]: E1003 18:28:02.121672 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af659d83-8386-4373-8de1-c754d9b639af" containerName="registry-server" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.121683 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="af659d83-8386-4373-8de1-c754d9b639af" containerName="registry-server" Oct 03 18:28:02 crc kubenswrapper[4835]: E1003 18:28:02.121694 4835 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="af659d83-8386-4373-8de1-c754d9b639af" containerName="extract-utilities" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.121701 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="af659d83-8386-4373-8de1-c754d9b639af" containerName="extract-utilities" Oct 03 18:28:02 crc kubenswrapper[4835]: E1003 18:28:02.121719 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af659d83-8386-4373-8de1-c754d9b639af" containerName="extract-content" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.121724 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="af659d83-8386-4373-8de1-c754d9b639af" containerName="extract-content" Oct 03 18:28:02 crc kubenswrapper[4835]: E1003 18:28:02.121732 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7966d381-a9c1-42b8-8cb5-39745dd5cbe9" containerName="extract-utilities" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.121738 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7966d381-a9c1-42b8-8cb5-39745dd5cbe9" containerName="extract-utilities" Oct 03 18:28:02 crc kubenswrapper[4835]: E1003 18:28:02.121745 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7966d381-a9c1-42b8-8cb5-39745dd5cbe9" containerName="extract-content" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.121751 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7966d381-a9c1-42b8-8cb5-39745dd5cbe9" containerName="extract-content" Oct 03 18:28:02 crc kubenswrapper[4835]: E1003 18:28:02.121762 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7966d381-a9c1-42b8-8cb5-39745dd5cbe9" containerName="registry-server" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.121767 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7966d381-a9c1-42b8-8cb5-39745dd5cbe9" containerName="registry-server" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.121859 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7966d381-a9c1-42b8-8cb5-39745dd5cbe9" containerName="registry-server" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.121868 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="af659d83-8386-4373-8de1-c754d9b639af" containerName="registry-server" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.122645 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.146443 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hs68b"] Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.266280 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f431bd1-4150-4921-9a64-0251b2714509-utilities\") pod \"community-operators-hs68b\" (UID: \"3f431bd1-4150-4921-9a64-0251b2714509\") " pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.266430 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbpq7\" (UniqueName: \"kubernetes.io/projected/3f431bd1-4150-4921-9a64-0251b2714509-kube-api-access-mbpq7\") pod \"community-operators-hs68b\" (UID: \"3f431bd1-4150-4921-9a64-0251b2714509\") " pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.266490 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f431bd1-4150-4921-9a64-0251b2714509-catalog-content\") pod \"community-operators-hs68b\" (UID: \"3f431bd1-4150-4921-9a64-0251b2714509\") " pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.367820 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbpq7\" (UniqueName: \"kubernetes.io/projected/3f431bd1-4150-4921-9a64-0251b2714509-kube-api-access-mbpq7\") pod \"community-operators-hs68b\" (UID: \"3f431bd1-4150-4921-9a64-0251b2714509\") " pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.368213 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f431bd1-4150-4921-9a64-0251b2714509-catalog-content\") pod \"community-operators-hs68b\" (UID: \"3f431bd1-4150-4921-9a64-0251b2714509\") " pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.368366 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f431bd1-4150-4921-9a64-0251b2714509-utilities\") pod \"community-operators-hs68b\" (UID: \"3f431bd1-4150-4921-9a64-0251b2714509\") " pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.368682 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f431bd1-4150-4921-9a64-0251b2714509-catalog-content\") pod \"community-operators-hs68b\" (UID: \"3f431bd1-4150-4921-9a64-0251b2714509\") " pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.368771 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f431bd1-4150-4921-9a64-0251b2714509-utilities\") pod \"community-operators-hs68b\" (UID: \"3f431bd1-4150-4921-9a64-0251b2714509\") " pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.392999 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mbpq7\" (UniqueName: \"kubernetes.io/projected/3f431bd1-4150-4921-9a64-0251b2714509-kube-api-access-mbpq7\") pod \"community-operators-hs68b\" (UID: \"3f431bd1-4150-4921-9a64-0251b2714509\") " pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.439527 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.897100 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-h2mqk"] Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.899748 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.903382 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.903448 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-wcl4x" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.903475 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-pfgk7"] Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.903625 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.904101 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pfgk7" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.909165 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.912797 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-pfgk7"] Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.972244 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-tcsl9"] Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.973183 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tcsl9" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.976006 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.976599 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.976762 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.977257 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-pg5jm" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.988662 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-7lqwn"] Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.989558 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-7lqwn" Oct 03 18:28:02 crc kubenswrapper[4835]: I1003 18:28:02.993369 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.008271 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-7lqwn"] Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.075431 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-metrics-certs\") pod \"speaker-tcsl9\" (UID: \"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d\") " pod="metallb-system/speaker-tcsl9" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.075489 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0b7c13ae-5fc7-4488-ad26-05f62f390a60-metrics\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.075575 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpgpg\" (UniqueName: \"kubernetes.io/projected/75a37aa1-a5db-4a60-b8c3-c06677284925-kube-api-access-kpgpg\") pod \"frr-k8s-webhook-server-64bf5d555-pfgk7\" (UID: \"75a37aa1-a5db-4a60-b8c3-c06677284925\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pfgk7" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.076555 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-memberlist\") pod \"speaker-tcsl9\" (UID: \"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d\") " pod="metallb-system/speaker-tcsl9" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.076606 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0b7c13ae-5fc7-4488-ad26-05f62f390a60-frr-conf\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.076719 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0b7c13ae-5fc7-4488-ad26-05f62f390a60-frr-sockets\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.076744 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s688\" (UniqueName: \"kubernetes.io/projected/0b7c13ae-5fc7-4488-ad26-05f62f390a60-kube-api-access-5s688\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.076829 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-metallb-excludel2\") pod \"speaker-tcsl9\" (UID: \"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d\") " pod="metallb-system/speaker-tcsl9" Oct 03 18:28:03 crc 
kubenswrapper[4835]: I1003 18:28:03.076850 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bsqh\" (UniqueName: \"kubernetes.io/projected/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-kube-api-access-6bsqh\") pod \"speaker-tcsl9\" (UID: \"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d\") " pod="metallb-system/speaker-tcsl9" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.076874 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75a37aa1-a5db-4a60-b8c3-c06677284925-cert\") pod \"frr-k8s-webhook-server-64bf5d555-pfgk7\" (UID: \"75a37aa1-a5db-4a60-b8c3-c06677284925\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pfgk7" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.076902 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0b7c13ae-5fc7-4488-ad26-05f62f390a60-reloader\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.077419 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b7c13ae-5fc7-4488-ad26-05f62f390a60-metrics-certs\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.077464 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0b7c13ae-5fc7-4488-ad26-05f62f390a60-frr-startup\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.083229 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hs68b"] Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.124017 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hs68b" event={"ID":"3f431bd1-4150-4921-9a64-0251b2714509","Type":"ContainerStarted","Data":"6b5f1c3f3c8868396494c989aeda816bdfcfc2c98ef3eff5e8cb117812e309f8"} Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.178184 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-metallb-excludel2\") pod \"speaker-tcsl9\" (UID: \"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d\") " pod="metallb-system/speaker-tcsl9" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.178223 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bsqh\" (UniqueName: \"kubernetes.io/projected/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-kube-api-access-6bsqh\") pod \"speaker-tcsl9\" (UID: \"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d\") " pod="metallb-system/speaker-tcsl9" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.178265 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75a37aa1-a5db-4a60-b8c3-c06677284925-cert\") pod \"frr-k8s-webhook-server-64bf5d555-pfgk7\" (UID: \"75a37aa1-a5db-4a60-b8c3-c06677284925\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pfgk7" Oct 
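For reference, each MountVolume operation above corresponds to a volume entry in the pod spec. A hypothetical sketch of how the frr-k8s volumes named in these entries would be declared with the k8s.io/api/core/v1 types; only the volume, secret and configmap names come from the log:

    // volumespec.go: hypothetical declaration of the frr-k8s volumes seen above.
    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        vols := []corev1.Volume{
            {Name: "frr-conf", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
            {Name: "frr-sockets", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
            {Name: "metrics-certs", VolumeSource: corev1.VolumeSource{
                Secret: &corev1.SecretVolumeSource{SecretName: "frr-k8s-certs-secret"},
            }},
            {Name: "frr-startup", VolumeSource: corev1.VolumeSource{
                ConfigMap: &corev1.ConfigMapVolumeSource{
                    LocalObjectReference: corev1.LocalObjectReference{Name: "frr-startup"},
                },
            }},
        }
        for _, v := range vols {
            fmt.Println("declared volume:", v.Name)
        }
    }
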
03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.178290 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0b7c13ae-5fc7-4488-ad26-05f62f390a60-reloader\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.178313 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc3150d4-d27e-43ed-9659-7034816a3221-cert\") pod \"controller-68d546b9d8-7lqwn\" (UID: \"dc3150d4-d27e-43ed-9659-7034816a3221\") " pod="metallb-system/controller-68d546b9d8-7lqwn" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.178341 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b7c13ae-5fc7-4488-ad26-05f62f390a60-metrics-certs\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.178358 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0b7c13ae-5fc7-4488-ad26-05f62f390a60-frr-startup\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.178393 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-metrics-certs\") pod \"speaker-tcsl9\" (UID: \"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d\") " pod="metallb-system/speaker-tcsl9" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.178408 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0b7c13ae-5fc7-4488-ad26-05f62f390a60-metrics\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.178432 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpgpg\" (UniqueName: \"kubernetes.io/projected/75a37aa1-a5db-4a60-b8c3-c06677284925-kube-api-access-kpgpg\") pod \"frr-k8s-webhook-server-64bf5d555-pfgk7\" (UID: \"75a37aa1-a5db-4a60-b8c3-c06677284925\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pfgk7" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.178462 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-memberlist\") pod \"speaker-tcsl9\" (UID: \"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d\") " pod="metallb-system/speaker-tcsl9" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.178478 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0b7c13ae-5fc7-4488-ad26-05f62f390a60-frr-conf\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.178496 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/dc3150d4-d27e-43ed-9659-7034816a3221-metrics-certs\") pod \"controller-68d546b9d8-7lqwn\" (UID: \"dc3150d4-d27e-43ed-9659-7034816a3221\") " pod="metallb-system/controller-68d546b9d8-7lqwn" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.178515 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k68p6\" (UniqueName: \"kubernetes.io/projected/dc3150d4-d27e-43ed-9659-7034816a3221-kube-api-access-k68p6\") pod \"controller-68d546b9d8-7lqwn\" (UID: \"dc3150d4-d27e-43ed-9659-7034816a3221\") " pod="metallb-system/controller-68d546b9d8-7lqwn" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.178538 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0b7c13ae-5fc7-4488-ad26-05f62f390a60-frr-sockets\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.178553 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s688\" (UniqueName: \"kubernetes.io/projected/0b7c13ae-5fc7-4488-ad26-05f62f390a60-kube-api-access-5s688\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.179027 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0b7c13ae-5fc7-4488-ad26-05f62f390a60-metrics\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.179273 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-metallb-excludel2\") pod \"speaker-tcsl9\" (UID: \"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d\") " pod="metallb-system/speaker-tcsl9" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.179282 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0b7c13ae-5fc7-4488-ad26-05f62f390a60-frr-conf\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: E1003 18:28:03.179329 4835 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 03 18:28:03 crc kubenswrapper[4835]: E1003 18:28:03.179371 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b7c13ae-5fc7-4488-ad26-05f62f390a60-metrics-certs podName:0b7c13ae-5fc7-4488-ad26-05f62f390a60 nodeName:}" failed. No retries permitted until 2025-10-03 18:28:03.679356367 +0000 UTC m=+825.395297239 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b7c13ae-5fc7-4488-ad26-05f62f390a60-metrics-certs") pod "frr-k8s-h2mqk" (UID: "0b7c13ae-5fc7-4488-ad26-05f62f390a60") : secret "frr-k8s-certs-secret" not found Oct 03 18:28:03 crc kubenswrapper[4835]: E1003 18:28:03.179384 4835 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 03 18:28:03 crc kubenswrapper[4835]: E1003 18:28:03.179432 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-memberlist podName:d15b62f6-9ba0-4ed6-a25d-a7629c881a6d nodeName:}" failed. No retries permitted until 2025-10-03 18:28:03.679412698 +0000 UTC m=+825.395353570 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-memberlist") pod "speaker-tcsl9" (UID: "d15b62f6-9ba0-4ed6-a25d-a7629c881a6d") : secret "metallb-memberlist" not found Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.180202 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0b7c13ae-5fc7-4488-ad26-05f62f390a60-frr-startup\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.180941 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0b7c13ae-5fc7-4488-ad26-05f62f390a60-frr-sockets\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.181383 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0b7c13ae-5fc7-4488-ad26-05f62f390a60-reloader\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.189934 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-metrics-certs\") pod \"speaker-tcsl9\" (UID: \"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d\") " pod="metallb-system/speaker-tcsl9" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.190270 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75a37aa1-a5db-4a60-b8c3-c06677284925-cert\") pod \"frr-k8s-webhook-server-64bf5d555-pfgk7\" (UID: \"75a37aa1-a5db-4a60-b8c3-c06677284925\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pfgk7" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.196106 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s688\" (UniqueName: \"kubernetes.io/projected/0b7c13ae-5fc7-4488-ad26-05f62f390a60-kube-api-access-5s688\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.199807 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpgpg\" (UniqueName: \"kubernetes.io/projected/75a37aa1-a5db-4a60-b8c3-c06677284925-kube-api-access-kpgpg\") pod \"frr-k8s-webhook-server-64bf5d555-pfgk7\" (UID: 
\"75a37aa1-a5db-4a60-b8c3-c06677284925\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pfgk7" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.202590 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bsqh\" (UniqueName: \"kubernetes.io/projected/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-kube-api-access-6bsqh\") pod \"speaker-tcsl9\" (UID: \"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d\") " pod="metallb-system/speaker-tcsl9" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.232010 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pfgk7" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.281151 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc3150d4-d27e-43ed-9659-7034816a3221-metrics-certs\") pod \"controller-68d546b9d8-7lqwn\" (UID: \"dc3150d4-d27e-43ed-9659-7034816a3221\") " pod="metallb-system/controller-68d546b9d8-7lqwn" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.281206 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k68p6\" (UniqueName: \"kubernetes.io/projected/dc3150d4-d27e-43ed-9659-7034816a3221-kube-api-access-k68p6\") pod \"controller-68d546b9d8-7lqwn\" (UID: \"dc3150d4-d27e-43ed-9659-7034816a3221\") " pod="metallb-system/controller-68d546b9d8-7lqwn" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.281258 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc3150d4-d27e-43ed-9659-7034816a3221-cert\") pod \"controller-68d546b9d8-7lqwn\" (UID: \"dc3150d4-d27e-43ed-9659-7034816a3221\") " pod="metallb-system/controller-68d546b9d8-7lqwn" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.286801 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc3150d4-d27e-43ed-9659-7034816a3221-metrics-certs\") pod \"controller-68d546b9d8-7lqwn\" (UID: \"dc3150d4-d27e-43ed-9659-7034816a3221\") " pod="metallb-system/controller-68d546b9d8-7lqwn" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.288335 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc3150d4-d27e-43ed-9659-7034816a3221-cert\") pod \"controller-68d546b9d8-7lqwn\" (UID: \"dc3150d4-d27e-43ed-9659-7034816a3221\") " pod="metallb-system/controller-68d546b9d8-7lqwn" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.297794 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k68p6\" (UniqueName: \"kubernetes.io/projected/dc3150d4-d27e-43ed-9659-7034816a3221-kube-api-access-k68p6\") pod \"controller-68d546b9d8-7lqwn\" (UID: \"dc3150d4-d27e-43ed-9659-7034816a3221\") " pod="metallb-system/controller-68d546b9d8-7lqwn" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.308003 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-7lqwn" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.622658 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-pfgk7"] Oct 03 18:28:03 crc kubenswrapper[4835]: W1003 18:28:03.628146 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75a37aa1_a5db_4a60_b8c3_c06677284925.slice/crio-512e722426080f70797494764a4471e7060cef7e1f7832e1ff4a9b19b8727d36 WatchSource:0}: Error finding container 512e722426080f70797494764a4471e7060cef7e1f7832e1ff4a9b19b8727d36: Status 404 returned error can't find the container with id 512e722426080f70797494764a4471e7060cef7e1f7832e1ff4a9b19b8727d36 Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.686630 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-memberlist\") pod \"speaker-tcsl9\" (UID: \"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d\") " pod="metallb-system/speaker-tcsl9" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.686964 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b7c13ae-5fc7-4488-ad26-05f62f390a60-metrics-certs\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: E1003 18:28:03.686779 4835 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 03 18:28:03 crc kubenswrapper[4835]: E1003 18:28:03.687046 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-memberlist podName:d15b62f6-9ba0-4ed6-a25d-a7629c881a6d nodeName:}" failed. No retries permitted until 2025-10-03 18:28:04.687032561 +0000 UTC m=+826.402973433 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-memberlist") pod "speaker-tcsl9" (UID: "d15b62f6-9ba0-4ed6-a25d-a7629c881a6d") : secret "metallb-memberlist" not found Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.690737 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b7c13ae-5fc7-4488-ad26-05f62f390a60-metrics-certs\") pod \"frr-k8s-h2mqk\" (UID: \"0b7c13ae-5fc7-4488-ad26-05f62f390a60\") " pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.715204 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-7lqwn"] Oct 03 18:28:03 crc kubenswrapper[4835]: W1003 18:28:03.728311 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc3150d4_d27e_43ed_9659_7034816a3221.slice/crio-33e6c93240fb27d0aa2c70bd2ff98b37ede9b583aac935127a70820d44f6bc09 WatchSource:0}: Error finding container 33e6c93240fb27d0aa2c70bd2ff98b37ede9b583aac935127a70820d44f6bc09: Status 404 returned error can't find the container with id 33e6c93240fb27d0aa2c70bd2ff98b37ede9b583aac935127a70820d44f6bc09 Oct 03 18:28:03 crc kubenswrapper[4835]: I1003 18:28:03.826657 4835 util.go:30] "No sandbox for pod can be found. 
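The secret errors above, and the retry scheduling around them, are the kubelet's normal behaviour when a pod references a Secret its operator has not created yet: SetUp fails with "not found", is retried after 500ms and then 1s, and succeeds as soon as the object appears (metrics-certs mounts on the first retry here, memberlist one second later). A standalone sketch, not kubelet code, of the same wait-for-the-secret pattern using the apimachinery backoff helper; the backoff parameters are illustrative and the namespace and secret name are copied from the log:

    // waitsecret.go: standalone sketch of retry-with-backoff until a Secret exists.
    package main

    import (
        "context"
        "fmt"
        "time"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/apimachinery/pkg/util/wait"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)

        backoff := wait.Backoff{Duration: 500 * time.Millisecond, Factor: 2.0, Steps: 6}
        err = wait.ExponentialBackoff(backoff, func() (bool, error) {
            _, getErr := cs.CoreV1().Secrets("metallb-system").Get(context.TODO(), "frr-k8s-certs-secret", metav1.GetOptions{})
            if getErr != nil {
                fmt.Println("still waiting:", getErr) // the owning operator has not created it yet
                return false, nil                     // keep retrying with a growing delay
            }
            return true, nil
        })
        if err != nil {
            panic(err) // times out if the Secret never appears
        }
        fmt.Println("secret present; the kubelet's next SetUp retry should succeed")
    }
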
Need to start a new one" pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:04 crc kubenswrapper[4835]: I1003 18:28:04.130888 4835 generic.go:334] "Generic (PLEG): container finished" podID="3f431bd1-4150-4921-9a64-0251b2714509" containerID="98be2b52d39993f327fe27e620b948cd12873fbbe6af5d94e9c9f2ea11d67f24" exitCode=0 Oct 03 18:28:04 crc kubenswrapper[4835]: I1003 18:28:04.130951 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hs68b" event={"ID":"3f431bd1-4150-4921-9a64-0251b2714509","Type":"ContainerDied","Data":"98be2b52d39993f327fe27e620b948cd12873fbbe6af5d94e9c9f2ea11d67f24"} Oct 03 18:28:04 crc kubenswrapper[4835]: I1003 18:28:04.132177 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2mqk" event={"ID":"0b7c13ae-5fc7-4488-ad26-05f62f390a60","Type":"ContainerStarted","Data":"c2824ef73d302c85e0e7590cd9bcc4409e44bc69d86589f78daa3e4b6c7977b5"} Oct 03 18:28:04 crc kubenswrapper[4835]: I1003 18:28:04.136787 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-7lqwn" event={"ID":"dc3150d4-d27e-43ed-9659-7034816a3221","Type":"ContainerStarted","Data":"50774622b4671f915a63e85cb40a89c9f3893d4872bf5342e2d8577347d09f3e"} Oct 03 18:28:04 crc kubenswrapper[4835]: I1003 18:28:04.136829 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-7lqwn" event={"ID":"dc3150d4-d27e-43ed-9659-7034816a3221","Type":"ContainerStarted","Data":"5c534b1ec5be82daa4714aa245eb0e846b023f576e3a779cf776127f1d30e743"} Oct 03 18:28:04 crc kubenswrapper[4835]: I1003 18:28:04.136843 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-7lqwn" event={"ID":"dc3150d4-d27e-43ed-9659-7034816a3221","Type":"ContainerStarted","Data":"33e6c93240fb27d0aa2c70bd2ff98b37ede9b583aac935127a70820d44f6bc09"} Oct 03 18:28:04 crc kubenswrapper[4835]: I1003 18:28:04.137012 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-7lqwn" Oct 03 18:28:04 crc kubenswrapper[4835]: I1003 18:28:04.138955 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pfgk7" event={"ID":"75a37aa1-a5db-4a60-b8c3-c06677284925","Type":"ContainerStarted","Data":"512e722426080f70797494764a4471e7060cef7e1f7832e1ff4a9b19b8727d36"} Oct 03 18:28:04 crc kubenswrapper[4835]: I1003 18:28:04.160729 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-7lqwn" podStartSLOduration=2.160706878 podStartE2EDuration="2.160706878s" podCreationTimestamp="2025-10-03 18:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:28:04.158792539 +0000 UTC m=+825.874733411" watchObservedRunningTime="2025-10-03 18:28:04.160706878 +0000 UTC m=+825.876647750" Oct 03 18:28:04 crc kubenswrapper[4835]: I1003 18:28:04.697719 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-memberlist\") pod \"speaker-tcsl9\" (UID: \"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d\") " pod="metallb-system/speaker-tcsl9" Oct 03 18:28:04 crc kubenswrapper[4835]: I1003 18:28:04.702734 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/d15b62f6-9ba0-4ed6-a25d-a7629c881a6d-memberlist\") pod \"speaker-tcsl9\" (UID: \"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d\") " pod="metallb-system/speaker-tcsl9" Oct 03 18:28:04 crc kubenswrapper[4835]: I1003 18:28:04.793943 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tcsl9" Oct 03 18:28:04 crc kubenswrapper[4835]: W1003 18:28:04.812907 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd15b62f6_9ba0_4ed6_a25d_a7629c881a6d.slice/crio-acfb5c8d6f7f6781593633540f9e3d45bb6ef504ddbb47c233011ee00b459020 WatchSource:0}: Error finding container acfb5c8d6f7f6781593633540f9e3d45bb6ef504ddbb47c233011ee00b459020: Status 404 returned error can't find the container with id acfb5c8d6f7f6781593633540f9e3d45bb6ef504ddbb47c233011ee00b459020 Oct 03 18:28:05 crc kubenswrapper[4835]: I1003 18:28:05.189490 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tcsl9" event={"ID":"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d","Type":"ContainerStarted","Data":"1e7ebac7fa68e86f357df0f776493707b7442fd5a859d765259ef2b4a3765b15"} Oct 03 18:28:05 crc kubenswrapper[4835]: I1003 18:28:05.189542 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tcsl9" event={"ID":"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d","Type":"ContainerStarted","Data":"acfb5c8d6f7f6781593633540f9e3d45bb6ef504ddbb47c233011ee00b459020"} Oct 03 18:28:05 crc kubenswrapper[4835]: I1003 18:28:05.209280 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hs68b" event={"ID":"3f431bd1-4150-4921-9a64-0251b2714509","Type":"ContainerStarted","Data":"111bcc30184c09298110268b27e58ebbcc12a60b03150f88c9d034b92005188e"} Oct 03 18:28:06 crc kubenswrapper[4835]: I1003 18:28:06.224043 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tcsl9" event={"ID":"d15b62f6-9ba0-4ed6-a25d-a7629c881a6d","Type":"ContainerStarted","Data":"e83a0e71c6f86bba941e87a30d26652138cf64dd16c5f75ba7e7abe36188f31f"} Oct 03 18:28:06 crc kubenswrapper[4835]: I1003 18:28:06.224434 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-tcsl9" Oct 03 18:28:06 crc kubenswrapper[4835]: I1003 18:28:06.232174 4835 generic.go:334] "Generic (PLEG): container finished" podID="3f431bd1-4150-4921-9a64-0251b2714509" containerID="111bcc30184c09298110268b27e58ebbcc12a60b03150f88c9d034b92005188e" exitCode=0 Oct 03 18:28:06 crc kubenswrapper[4835]: I1003 18:28:06.232224 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hs68b" event={"ID":"3f431bd1-4150-4921-9a64-0251b2714509","Type":"ContainerDied","Data":"111bcc30184c09298110268b27e58ebbcc12a60b03150f88c9d034b92005188e"} Oct 03 18:28:06 crc kubenswrapper[4835]: I1003 18:28:06.248287 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-tcsl9" podStartSLOduration=4.248264734 podStartE2EDuration="4.248264734s" podCreationTimestamp="2025-10-03 18:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:28:06.247738931 +0000 UTC m=+827.963679823" watchObservedRunningTime="2025-10-03 18:28:06.248264734 +0000 UTC m=+827.964205606" Oct 03 18:28:07 crc kubenswrapper[4835]: I1003 18:28:07.242294 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-hs68b" event={"ID":"3f431bd1-4150-4921-9a64-0251b2714509","Type":"ContainerStarted","Data":"d49613d69e3cecf287dbed2cefbe712722f0563fddd483e486c430ad957342f1"} Oct 03 18:28:07 crc kubenswrapper[4835]: I1003 18:28:07.262264 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hs68b" podStartSLOduration=2.727836645 podStartE2EDuration="5.262244569s" podCreationTimestamp="2025-10-03 18:28:02 +0000 UTC" firstStartedPulling="2025-10-03 18:28:04.132117974 +0000 UTC m=+825.848058846" lastFinishedPulling="2025-10-03 18:28:06.666525898 +0000 UTC m=+828.382466770" observedRunningTime="2025-10-03 18:28:07.258886805 +0000 UTC m=+828.974827677" watchObservedRunningTime="2025-10-03 18:28:07.262244569 +0000 UTC m=+828.978185441" Oct 03 18:28:12 crc kubenswrapper[4835]: I1003 18:28:12.274743 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pfgk7" event={"ID":"75a37aa1-a5db-4a60-b8c3-c06677284925","Type":"ContainerStarted","Data":"6f93a42ab7c006d66731fb1de6cada166cd76faeac62a29b6b8f4831be6ed40b"} Oct 03 18:28:12 crc kubenswrapper[4835]: I1003 18:28:12.275384 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pfgk7" Oct 03 18:28:12 crc kubenswrapper[4835]: I1003 18:28:12.275970 4835 generic.go:334] "Generic (PLEG): container finished" podID="0b7c13ae-5fc7-4488-ad26-05f62f390a60" containerID="879345bad28d1162843c7e70fb81c7fdc5d3b8f6ee935b249ffd6dca362b88a7" exitCode=0 Oct 03 18:28:12 crc kubenswrapper[4835]: I1003 18:28:12.276011 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2mqk" event={"ID":"0b7c13ae-5fc7-4488-ad26-05f62f390a60","Type":"ContainerDied","Data":"879345bad28d1162843c7e70fb81c7fdc5d3b8f6ee935b249ffd6dca362b88a7"} Oct 03 18:28:12 crc kubenswrapper[4835]: I1003 18:28:12.317508 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pfgk7" podStartSLOduration=2.615363539 podStartE2EDuration="10.317488368s" podCreationTimestamp="2025-10-03 18:28:02 +0000 UTC" firstStartedPulling="2025-10-03 18:28:03.630255905 +0000 UTC m=+825.346196777" lastFinishedPulling="2025-10-03 18:28:11.332380734 +0000 UTC m=+833.048321606" observedRunningTime="2025-10-03 18:28:12.295782427 +0000 UTC m=+834.011723299" watchObservedRunningTime="2025-10-03 18:28:12.317488368 +0000 UTC m=+834.033429240" Oct 03 18:28:12 crc kubenswrapper[4835]: I1003 18:28:12.440361 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:12 crc kubenswrapper[4835]: I1003 18:28:12.440655 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:12 crc kubenswrapper[4835]: I1003 18:28:12.480975 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:13 crc kubenswrapper[4835]: I1003 18:28:13.316265 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-7lqwn" Oct 03 18:28:13 crc kubenswrapper[4835]: I1003 18:28:13.326475 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:14 crc kubenswrapper[4835]: I1003 
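The pod_startup_latency_tracker entries above report two figures per pod. From the numbers themselves, podStartE2EDuration appears to be observedRunningTime minus podCreationTimestamp, while podStartSLOduration appears to be that value minus the image-pull window (lastFinishedPulling minus firstStartedPulling); both tracker entries on this stretch of the log are consistent with that reading. A short sketch that redoes the arithmetic for community-operators-hs68b with the timestamps copied from its entry:

    // startuplatency.go: re-derives podStartSLOduration from the logged timestamps.
    package main

    import (
        "fmt"
        "time"
    )

    func mustParse(s string) time.Time {
        t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        firstPull := mustParse("2025-10-03 18:28:04.132117974 +0000 UTC")
        lastPull := mustParse("2025-10-03 18:28:06.666525898 +0000 UTC")
        e2e := 5.262244569 // podStartE2EDuration for community-operators-hs68b, in seconds

        pull := lastPull.Sub(firstPull).Seconds()     // 2.534407924s spent pulling the image
        fmt.Printf("pull window:  %.9fs\n", pull)
        fmt.Printf("SLO duration: %.9fs\n", e2e-pull) // 2.727836645s, matching the logged value
    }

Running the same subtraction for frr-k8s-webhook-server-64bf5d555-pfgk7 gives 10.317488368 minus 7.702124829, i.e. 2.615363539, again matching the logged SLO duration.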
18:28:14.288754 4835 generic.go:334] "Generic (PLEG): container finished" podID="0b7c13ae-5fc7-4488-ad26-05f62f390a60" containerID="aea3a926a128081cecf67850bbc0e35aa4e43154e475075359a34574dc3c6293" exitCode=0 Oct 03 18:28:14 crc kubenswrapper[4835]: I1003 18:28:14.288808 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2mqk" event={"ID":"0b7c13ae-5fc7-4488-ad26-05f62f390a60","Type":"ContainerDied","Data":"aea3a926a128081cecf67850bbc0e35aa4e43154e475075359a34574dc3c6293"} Oct 03 18:28:14 crc kubenswrapper[4835]: I1003 18:28:14.713999 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hs68b"] Oct 03 18:28:15 crc kubenswrapper[4835]: I1003 18:28:15.296294 4835 generic.go:334] "Generic (PLEG): container finished" podID="0b7c13ae-5fc7-4488-ad26-05f62f390a60" containerID="49853f9e5903ec0c7c35c7e9c2da734ac461e9a30e887077624b8cbc087a32c8" exitCode=0 Oct 03 18:28:15 crc kubenswrapper[4835]: I1003 18:28:15.296362 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2mqk" event={"ID":"0b7c13ae-5fc7-4488-ad26-05f62f390a60","Type":"ContainerDied","Data":"49853f9e5903ec0c7c35c7e9c2da734ac461e9a30e887077624b8cbc087a32c8"} Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.308683 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2mqk" event={"ID":"0b7c13ae-5fc7-4488-ad26-05f62f390a60","Type":"ContainerStarted","Data":"d4176c74e1301054f7ff49920fa6df6cb666eece4d3c4b1a0ff59ca2aa9bb708"} Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.311451 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2mqk" event={"ID":"0b7c13ae-5fc7-4488-ad26-05f62f390a60","Type":"ContainerStarted","Data":"7d5c0bcf9164b5fd05728815eb396eda62ef178383489249e6f0c0fce3096c29"} Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.311488 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2mqk" event={"ID":"0b7c13ae-5fc7-4488-ad26-05f62f390a60","Type":"ContainerStarted","Data":"a213826e78d8f11bf3c64fcce47f409e9b452afb477cd9181f11db0e97c264de"} Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.308771 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hs68b" podUID="3f431bd1-4150-4921-9a64-0251b2714509" containerName="registry-server" containerID="cri-o://d49613d69e3cecf287dbed2cefbe712722f0563fddd483e486c430ad957342f1" gracePeriod=2 Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.311530 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2mqk" event={"ID":"0b7c13ae-5fc7-4488-ad26-05f62f390a60","Type":"ContainerStarted","Data":"41b361a528806664bb4ebcef1e68d0a714f0e78e8348899f270021319489b08a"} Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.311540 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2mqk" event={"ID":"0b7c13ae-5fc7-4488-ad26-05f62f390a60","Type":"ContainerStarted","Data":"2aaa0e7ce69cfdeb4e26b904b4d6bf43acd0610a56ea8ff7600c13936f2b0dbd"} Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.311551 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h2mqk" event={"ID":"0b7c13ae-5fc7-4488-ad26-05f62f390a60","Type":"ContainerStarted","Data":"f3a5534771ede3de916dc35877410491c73c3598f8a276cec131b0faa6621d86"} Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.335313 4835 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="metallb-system/frr-k8s-h2mqk" podStartSLOduration=6.908145678 podStartE2EDuration="14.335295838s" podCreationTimestamp="2025-10-03 18:28:02 +0000 UTC" firstStartedPulling="2025-10-03 18:28:03.924757681 +0000 UTC m=+825.640698553" lastFinishedPulling="2025-10-03 18:28:11.351907851 +0000 UTC m=+833.067848713" observedRunningTime="2025-10-03 18:28:16.331827841 +0000 UTC m=+838.047768753" watchObservedRunningTime="2025-10-03 18:28:16.335295838 +0000 UTC m=+838.051236730" Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.684796 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.841955 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f431bd1-4150-4921-9a64-0251b2714509-catalog-content\") pod \"3f431bd1-4150-4921-9a64-0251b2714509\" (UID: \"3f431bd1-4150-4921-9a64-0251b2714509\") " Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.842092 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbpq7\" (UniqueName: \"kubernetes.io/projected/3f431bd1-4150-4921-9a64-0251b2714509-kube-api-access-mbpq7\") pod \"3f431bd1-4150-4921-9a64-0251b2714509\" (UID: \"3f431bd1-4150-4921-9a64-0251b2714509\") " Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.842204 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f431bd1-4150-4921-9a64-0251b2714509-utilities\") pod \"3f431bd1-4150-4921-9a64-0251b2714509\" (UID: \"3f431bd1-4150-4921-9a64-0251b2714509\") " Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.842935 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f431bd1-4150-4921-9a64-0251b2714509-utilities" (OuterVolumeSpecName: "utilities") pod "3f431bd1-4150-4921-9a64-0251b2714509" (UID: "3f431bd1-4150-4921-9a64-0251b2714509"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.847362 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f431bd1-4150-4921-9a64-0251b2714509-kube-api-access-mbpq7" (OuterVolumeSpecName: "kube-api-access-mbpq7") pod "3f431bd1-4150-4921-9a64-0251b2714509" (UID: "3f431bd1-4150-4921-9a64-0251b2714509"). InnerVolumeSpecName "kube-api-access-mbpq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.884316 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f431bd1-4150-4921-9a64-0251b2714509-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f431bd1-4150-4921-9a64-0251b2714509" (UID: "3f431bd1-4150-4921-9a64-0251b2714509"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.943539 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f431bd1-4150-4921-9a64-0251b2714509-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.943583 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbpq7\" (UniqueName: \"kubernetes.io/projected/3f431bd1-4150-4921-9a64-0251b2714509-kube-api-access-mbpq7\") on node \"crc\" DevicePath \"\"" Oct 03 18:28:16 crc kubenswrapper[4835]: I1003 18:28:16.943600 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f431bd1-4150-4921-9a64-0251b2714509-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:28:17 crc kubenswrapper[4835]: I1003 18:28:17.315533 4835 generic.go:334] "Generic (PLEG): container finished" podID="3f431bd1-4150-4921-9a64-0251b2714509" containerID="d49613d69e3cecf287dbed2cefbe712722f0563fddd483e486c430ad957342f1" exitCode=0 Oct 03 18:28:17 crc kubenswrapper[4835]: I1003 18:28:17.315596 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hs68b" Oct 03 18:28:17 crc kubenswrapper[4835]: I1003 18:28:17.315608 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hs68b" event={"ID":"3f431bd1-4150-4921-9a64-0251b2714509","Type":"ContainerDied","Data":"d49613d69e3cecf287dbed2cefbe712722f0563fddd483e486c430ad957342f1"} Oct 03 18:28:17 crc kubenswrapper[4835]: I1003 18:28:17.315646 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hs68b" event={"ID":"3f431bd1-4150-4921-9a64-0251b2714509","Type":"ContainerDied","Data":"6b5f1c3f3c8868396494c989aeda816bdfcfc2c98ef3eff5e8cb117812e309f8"} Oct 03 18:28:17 crc kubenswrapper[4835]: I1003 18:28:17.315666 4835 scope.go:117] "RemoveContainer" containerID="d49613d69e3cecf287dbed2cefbe712722f0563fddd483e486c430ad957342f1" Oct 03 18:28:17 crc kubenswrapper[4835]: I1003 18:28:17.315972 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:17 crc kubenswrapper[4835]: I1003 18:28:17.336937 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hs68b"] Oct 03 18:28:17 crc kubenswrapper[4835]: I1003 18:28:17.337403 4835 scope.go:117] "RemoveContainer" containerID="111bcc30184c09298110268b27e58ebbcc12a60b03150f88c9d034b92005188e" Oct 03 18:28:17 crc kubenswrapper[4835]: I1003 18:28:17.340506 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hs68b"] Oct 03 18:28:17 crc kubenswrapper[4835]: I1003 18:28:17.351045 4835 scope.go:117] "RemoveContainer" containerID="98be2b52d39993f327fe27e620b948cd12873fbbe6af5d94e9c9f2ea11d67f24" Oct 03 18:28:17 crc kubenswrapper[4835]: I1003 18:28:17.381229 4835 scope.go:117] "RemoveContainer" containerID="d49613d69e3cecf287dbed2cefbe712722f0563fddd483e486c430ad957342f1" Oct 03 18:28:17 crc kubenswrapper[4835]: E1003 18:28:17.383248 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d49613d69e3cecf287dbed2cefbe712722f0563fddd483e486c430ad957342f1\": container with ID starting with d49613d69e3cecf287dbed2cefbe712722f0563fddd483e486c430ad957342f1 not 
found: ID does not exist" containerID="d49613d69e3cecf287dbed2cefbe712722f0563fddd483e486c430ad957342f1" Oct 03 18:28:17 crc kubenswrapper[4835]: I1003 18:28:17.383387 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d49613d69e3cecf287dbed2cefbe712722f0563fddd483e486c430ad957342f1"} err="failed to get container status \"d49613d69e3cecf287dbed2cefbe712722f0563fddd483e486c430ad957342f1\": rpc error: code = NotFound desc = could not find container \"d49613d69e3cecf287dbed2cefbe712722f0563fddd483e486c430ad957342f1\": container with ID starting with d49613d69e3cecf287dbed2cefbe712722f0563fddd483e486c430ad957342f1 not found: ID does not exist" Oct 03 18:28:17 crc kubenswrapper[4835]: I1003 18:28:17.383432 4835 scope.go:117] "RemoveContainer" containerID="111bcc30184c09298110268b27e58ebbcc12a60b03150f88c9d034b92005188e" Oct 03 18:28:17 crc kubenswrapper[4835]: E1003 18:28:17.383852 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"111bcc30184c09298110268b27e58ebbcc12a60b03150f88c9d034b92005188e\": container with ID starting with 111bcc30184c09298110268b27e58ebbcc12a60b03150f88c9d034b92005188e not found: ID does not exist" containerID="111bcc30184c09298110268b27e58ebbcc12a60b03150f88c9d034b92005188e" Oct 03 18:28:17 crc kubenswrapper[4835]: I1003 18:28:17.383885 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"111bcc30184c09298110268b27e58ebbcc12a60b03150f88c9d034b92005188e"} err="failed to get container status \"111bcc30184c09298110268b27e58ebbcc12a60b03150f88c9d034b92005188e\": rpc error: code = NotFound desc = could not find container \"111bcc30184c09298110268b27e58ebbcc12a60b03150f88c9d034b92005188e\": container with ID starting with 111bcc30184c09298110268b27e58ebbcc12a60b03150f88c9d034b92005188e not found: ID does not exist" Oct 03 18:28:17 crc kubenswrapper[4835]: I1003 18:28:17.383908 4835 scope.go:117] "RemoveContainer" containerID="98be2b52d39993f327fe27e620b948cd12873fbbe6af5d94e9c9f2ea11d67f24" Oct 03 18:28:17 crc kubenswrapper[4835]: E1003 18:28:17.384097 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98be2b52d39993f327fe27e620b948cd12873fbbe6af5d94e9c9f2ea11d67f24\": container with ID starting with 98be2b52d39993f327fe27e620b948cd12873fbbe6af5d94e9c9f2ea11d67f24 not found: ID does not exist" containerID="98be2b52d39993f327fe27e620b948cd12873fbbe6af5d94e9c9f2ea11d67f24" Oct 03 18:28:17 crc kubenswrapper[4835]: I1003 18:28:17.384120 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98be2b52d39993f327fe27e620b948cd12873fbbe6af5d94e9c9f2ea11d67f24"} err="failed to get container status \"98be2b52d39993f327fe27e620b948cd12873fbbe6af5d94e9c9f2ea11d67f24\": rpc error: code = NotFound desc = could not find container \"98be2b52d39993f327fe27e620b948cd12873fbbe6af5d94e9c9f2ea11d67f24\": container with ID starting with 98be2b52d39993f327fe27e620b948cd12873fbbe6af5d94e9c9f2ea11d67f24 not found: ID does not exist" Oct 03 18:28:18 crc kubenswrapper[4835]: I1003 18:28:18.827325 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:18 crc kubenswrapper[4835]: I1003 18:28:18.866392 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:18 crc 
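The RemoveContainer / ContainerStatus errors above are not a cleanup failure: the containers were already removed along with their sandbox, so the runtime answers the status query with gRPC NotFound and the kubelet records the error and moves on. A small sketch of the same treat-NotFound-as-already-deleted check using the standard grpc-go status helpers; the helper and the message text are hypothetical:

    // notfound.go: sketch of tolerating a NotFound answer from the runtime.
    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // alreadyGone reports whether err is a gRPC NotFound, which is what the
    // runtime returns here for containers that vanished with their sandbox.
    func alreadyGone(err error) bool {
        s, ok := status.FromError(err)
        return ok && s.Code() == codes.NotFound
    }

    func main() {
        err := status.Error(codes.NotFound, "could not find container <container-id>")
        if alreadyGone(err) {
            fmt.Println("container already removed; treat the delete as a no-op")
        } else if err != nil {
            fmt.Println("unexpected error:", err)
        }
    }
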
kubenswrapper[4835]: I1003 18:28:18.891188 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f431bd1-4150-4921-9a64-0251b2714509" path="/var/lib/kubelet/pods/3f431bd1-4150-4921-9a64-0251b2714509/volumes" Oct 03 18:28:23 crc kubenswrapper[4835]: I1003 18:28:23.238042 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pfgk7" Oct 03 18:28:24 crc kubenswrapper[4835]: I1003 18:28:24.797929 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-tcsl9" Oct 03 18:28:31 crc kubenswrapper[4835]: I1003 18:28:31.120883 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ct8fh"] Oct 03 18:28:31 crc kubenswrapper[4835]: E1003 18:28:31.121799 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f431bd1-4150-4921-9a64-0251b2714509" containerName="registry-server" Oct 03 18:28:31 crc kubenswrapper[4835]: I1003 18:28:31.121837 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f431bd1-4150-4921-9a64-0251b2714509" containerName="registry-server" Oct 03 18:28:31 crc kubenswrapper[4835]: E1003 18:28:31.121851 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f431bd1-4150-4921-9a64-0251b2714509" containerName="extract-utilities" Oct 03 18:28:31 crc kubenswrapper[4835]: I1003 18:28:31.121859 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f431bd1-4150-4921-9a64-0251b2714509" containerName="extract-utilities" Oct 03 18:28:31 crc kubenswrapper[4835]: E1003 18:28:31.121873 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f431bd1-4150-4921-9a64-0251b2714509" containerName="extract-content" Oct 03 18:28:31 crc kubenswrapper[4835]: I1003 18:28:31.121881 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f431bd1-4150-4921-9a64-0251b2714509" containerName="extract-content" Oct 03 18:28:31 crc kubenswrapper[4835]: I1003 18:28:31.122013 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f431bd1-4150-4921-9a64-0251b2714509" containerName="registry-server" Oct 03 18:28:31 crc kubenswrapper[4835]: I1003 18:28:31.122414 4835 util.go:30] "No sandbox for pod can be found. 
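The "Cleaned up orphaned pod volumes dir" entry above names the on-disk location the reconciler has been mounting into all along: /var/lib/kubelet/pods/<podUID>/volumes, with one subdirectory per volume plugin (the plugin name with / replaced by ~, for example kubernetes.io~empty-dir). A node-local sketch that lists whatever is still present under one pod's volumes directory; the UID below is the deleted community-operators pod's, so on this node the directory is already gone and the sketch simply reports that:

    // orphanvolumes.go: node-local sketch; requires root on the node.
    package main

    import (
        "fmt"
        "io/fs"
        "path/filepath"
    )

    func main() {
        dir := "/var/lib/kubelet/pods/3f431bd1-4150-4921-9a64-0251b2714509/volumes"
        err := filepath.WalkDir(dir, func(path string, d fs.DirEntry, walkErr error) error {
            if walkErr != nil {
                return walkErr // the directory disappears once the kubelet finishes cleanup
            }
            if d.IsDir() && path != dir {
                fmt.Println(path) // e.g. <dir>/kubernetes.io~empty-dir/catalog-content
            }
            return nil
        })
        if err != nil {
            fmt.Println("volumes dir already cleaned up or unreadable:", err)
        }
    }
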
Need to start a new one" pod="openstack-operators/openstack-operator-index-ct8fh" Oct 03 18:28:31 crc kubenswrapper[4835]: I1003 18:28:31.124593 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 03 18:28:31 crc kubenswrapper[4835]: I1003 18:28:31.124941 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-m4wkz" Oct 03 18:28:31 crc kubenswrapper[4835]: I1003 18:28:31.126588 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 03 18:28:31 crc kubenswrapper[4835]: I1003 18:28:31.127148 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ct8fh"] Oct 03 18:28:31 crc kubenswrapper[4835]: I1003 18:28:31.233356 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8fdn\" (UniqueName: \"kubernetes.io/projected/ef6029b4-ceb8-498c-9925-74d367072557-kube-api-access-s8fdn\") pod \"openstack-operator-index-ct8fh\" (UID: \"ef6029b4-ceb8-498c-9925-74d367072557\") " pod="openstack-operators/openstack-operator-index-ct8fh" Oct 03 18:28:31 crc kubenswrapper[4835]: I1003 18:28:31.334227 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8fdn\" (UniqueName: \"kubernetes.io/projected/ef6029b4-ceb8-498c-9925-74d367072557-kube-api-access-s8fdn\") pod \"openstack-operator-index-ct8fh\" (UID: \"ef6029b4-ceb8-498c-9925-74d367072557\") " pod="openstack-operators/openstack-operator-index-ct8fh" Oct 03 18:28:31 crc kubenswrapper[4835]: I1003 18:28:31.359956 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8fdn\" (UniqueName: \"kubernetes.io/projected/ef6029b4-ceb8-498c-9925-74d367072557-kube-api-access-s8fdn\") pod \"openstack-operator-index-ct8fh\" (UID: \"ef6029b4-ceb8-498c-9925-74d367072557\") " pod="openstack-operators/openstack-operator-index-ct8fh" Oct 03 18:28:31 crc kubenswrapper[4835]: I1003 18:28:31.441389 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ct8fh" Oct 03 18:28:31 crc kubenswrapper[4835]: I1003 18:28:31.843649 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ct8fh"] Oct 03 18:28:31 crc kubenswrapper[4835]: W1003 18:28:31.851173 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef6029b4_ceb8_498c_9925_74d367072557.slice/crio-bf93cdf8284554d4881041ee1df3effb3c5419bd5fce465b909c79269feabbd1 WatchSource:0}: Error finding container bf93cdf8284554d4881041ee1df3effb3c5419bd5fce465b909c79269feabbd1: Status 404 returned error can't find the container with id bf93cdf8284554d4881041ee1df3effb3c5419bd5fce465b909c79269feabbd1 Oct 03 18:28:32 crc kubenswrapper[4835]: I1003 18:28:32.415679 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ct8fh" event={"ID":"ef6029b4-ceb8-498c-9925-74d367072557","Type":"ContainerStarted","Data":"bf93cdf8284554d4881041ee1df3effb3c5419bd5fce465b909c79269feabbd1"} Oct 03 18:28:33 crc kubenswrapper[4835]: I1003 18:28:33.829947 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-h2mqk" Oct 03 18:28:34 crc kubenswrapper[4835]: I1003 18:28:34.431243 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ct8fh" event={"ID":"ef6029b4-ceb8-498c-9925-74d367072557","Type":"ContainerStarted","Data":"910f3c608b51a7c1f8afa4fbf7838c11dbe949fb27bd7a63686c8d6fc2dd290f"} Oct 03 18:28:34 crc kubenswrapper[4835]: I1003 18:28:34.446053 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ct8fh" podStartSLOduration=1.2633315920000001 podStartE2EDuration="3.446026712s" podCreationTimestamp="2025-10-03 18:28:31 +0000 UTC" firstStartedPulling="2025-10-03 18:28:31.855001447 +0000 UTC m=+853.570942319" lastFinishedPulling="2025-10-03 18:28:34.037696567 +0000 UTC m=+855.753637439" observedRunningTime="2025-10-03 18:28:34.443441258 +0000 UTC m=+856.159382130" watchObservedRunningTime="2025-10-03 18:28:34.446026712 +0000 UTC m=+856.161967584" Oct 03 18:28:41 crc kubenswrapper[4835]: I1003 18:28:41.441729 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-ct8fh" Oct 03 18:28:41 crc kubenswrapper[4835]: I1003 18:28:41.442275 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-ct8fh" Oct 03 18:28:41 crc kubenswrapper[4835]: I1003 18:28:41.474688 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-ct8fh" Oct 03 18:28:41 crc kubenswrapper[4835]: I1003 18:28:41.500466 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-ct8fh" Oct 03 18:28:44 crc kubenswrapper[4835]: I1003 18:28:44.745163 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt"] Oct 03 18:28:44 crc kubenswrapper[4835]: I1003 18:28:44.746731 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" Oct 03 18:28:44 crc kubenswrapper[4835]: I1003 18:28:44.748267 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6hq5l" Oct 03 18:28:44 crc kubenswrapper[4835]: I1003 18:28:44.759172 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt"] Oct 03 18:28:44 crc kubenswrapper[4835]: I1003 18:28:44.823497 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbp6b\" (UniqueName: \"kubernetes.io/projected/a719243c-35c2-4ecf-89ff-7843179f36de-kube-api-access-kbp6b\") pod \"76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt\" (UID: \"a719243c-35c2-4ecf-89ff-7843179f36de\") " pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" Oct 03 18:28:44 crc kubenswrapper[4835]: I1003 18:28:44.823557 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a719243c-35c2-4ecf-89ff-7843179f36de-util\") pod \"76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt\" (UID: \"a719243c-35c2-4ecf-89ff-7843179f36de\") " pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" Oct 03 18:28:44 crc kubenswrapper[4835]: I1003 18:28:44.823622 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a719243c-35c2-4ecf-89ff-7843179f36de-bundle\") pod \"76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt\" (UID: \"a719243c-35c2-4ecf-89ff-7843179f36de\") " pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" Oct 03 18:28:44 crc kubenswrapper[4835]: I1003 18:28:44.925364 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbp6b\" (UniqueName: \"kubernetes.io/projected/a719243c-35c2-4ecf-89ff-7843179f36de-kube-api-access-kbp6b\") pod \"76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt\" (UID: \"a719243c-35c2-4ecf-89ff-7843179f36de\") " pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" Oct 03 18:28:44 crc kubenswrapper[4835]: I1003 18:28:44.925431 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a719243c-35c2-4ecf-89ff-7843179f36de-util\") pod \"76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt\" (UID: \"a719243c-35c2-4ecf-89ff-7843179f36de\") " pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" Oct 03 18:28:44 crc kubenswrapper[4835]: I1003 18:28:44.925513 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a719243c-35c2-4ecf-89ff-7843179f36de-bundle\") pod \"76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt\" (UID: \"a719243c-35c2-4ecf-89ff-7843179f36de\") " pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" Oct 03 18:28:44 crc kubenswrapper[4835]: I1003 18:28:44.926116 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a719243c-35c2-4ecf-89ff-7843179f36de-bundle\") pod \"76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt\" (UID: \"a719243c-35c2-4ecf-89ff-7843179f36de\") " pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" Oct 03 18:28:44 crc kubenswrapper[4835]: I1003 18:28:44.926155 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a719243c-35c2-4ecf-89ff-7843179f36de-util\") pod \"76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt\" (UID: \"a719243c-35c2-4ecf-89ff-7843179f36de\") " pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" Oct 03 18:28:44 crc kubenswrapper[4835]: I1003 18:28:44.943890 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbp6b\" (UniqueName: \"kubernetes.io/projected/a719243c-35c2-4ecf-89ff-7843179f36de-kube-api-access-kbp6b\") pod \"76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt\" (UID: \"a719243c-35c2-4ecf-89ff-7843179f36de\") " pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" Oct 03 18:28:45 crc kubenswrapper[4835]: I1003 18:28:45.060388 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" Oct 03 18:28:45 crc kubenswrapper[4835]: I1003 18:28:45.450431 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt"] Oct 03 18:28:45 crc kubenswrapper[4835]: I1003 18:28:45.493023 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" event={"ID":"a719243c-35c2-4ecf-89ff-7843179f36de","Type":"ContainerStarted","Data":"ad2d05c7795936abdb9afcf111d44a49955b0c2346dbfbb3412666ab64f86309"} Oct 03 18:28:46 crc kubenswrapper[4835]: I1003 18:28:46.501122 4835 generic.go:334] "Generic (PLEG): container finished" podID="a719243c-35c2-4ecf-89ff-7843179f36de" containerID="89bbfee42854b6cf37493126d3cc5dec08f25611f76702d8f88320fe5d1f39bb" exitCode=0 Oct 03 18:28:46 crc kubenswrapper[4835]: I1003 18:28:46.501225 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" event={"ID":"a719243c-35c2-4ecf-89ff-7843179f36de","Type":"ContainerDied","Data":"89bbfee42854b6cf37493126d3cc5dec08f25611f76702d8f88320fe5d1f39bb"} Oct 03 18:28:47 crc kubenswrapper[4835]: I1003 18:28:47.509222 4835 generic.go:334] "Generic (PLEG): container finished" podID="a719243c-35c2-4ecf-89ff-7843179f36de" containerID="73ff66464c8e809f54ae793dbf23bfb6e83972a7aca74dccc7112f1044c18fa4" exitCode=0 Oct 03 18:28:47 crc kubenswrapper[4835]: I1003 18:28:47.509580 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" event={"ID":"a719243c-35c2-4ecf-89ff-7843179f36de","Type":"ContainerDied","Data":"73ff66464c8e809f54ae793dbf23bfb6e83972a7aca74dccc7112f1044c18fa4"} Oct 03 18:28:48 crc kubenswrapper[4835]: I1003 18:28:48.517167 4835 generic.go:334] "Generic (PLEG): container finished" podID="a719243c-35c2-4ecf-89ff-7843179f36de" containerID="c879f87594fc37f9a9ff4f1486666683bc9b424c21f43dd1e99f8da158111597" exitCode=0 Oct 03 18:28:48 crc kubenswrapper[4835]: I1003 18:28:48.517218 4835 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" event={"ID":"a719243c-35c2-4ecf-89ff-7843179f36de","Type":"ContainerDied","Data":"c879f87594fc37f9a9ff4f1486666683bc9b424c21f43dd1e99f8da158111597"} Oct 03 18:28:49 crc kubenswrapper[4835]: I1003 18:28:49.747781 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" Oct 03 18:28:49 crc kubenswrapper[4835]: I1003 18:28:49.886403 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbp6b\" (UniqueName: \"kubernetes.io/projected/a719243c-35c2-4ecf-89ff-7843179f36de-kube-api-access-kbp6b\") pod \"a719243c-35c2-4ecf-89ff-7843179f36de\" (UID: \"a719243c-35c2-4ecf-89ff-7843179f36de\") " Oct 03 18:28:49 crc kubenswrapper[4835]: I1003 18:28:49.886520 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a719243c-35c2-4ecf-89ff-7843179f36de-util\") pod \"a719243c-35c2-4ecf-89ff-7843179f36de\" (UID: \"a719243c-35c2-4ecf-89ff-7843179f36de\") " Oct 03 18:28:49 crc kubenswrapper[4835]: I1003 18:28:49.886563 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a719243c-35c2-4ecf-89ff-7843179f36de-bundle\") pod \"a719243c-35c2-4ecf-89ff-7843179f36de\" (UID: \"a719243c-35c2-4ecf-89ff-7843179f36de\") " Oct 03 18:28:49 crc kubenswrapper[4835]: I1003 18:28:49.887375 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a719243c-35c2-4ecf-89ff-7843179f36de-bundle" (OuterVolumeSpecName: "bundle") pod "a719243c-35c2-4ecf-89ff-7843179f36de" (UID: "a719243c-35c2-4ecf-89ff-7843179f36de"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:28:49 crc kubenswrapper[4835]: I1003 18:28:49.891954 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a719243c-35c2-4ecf-89ff-7843179f36de-kube-api-access-kbp6b" (OuterVolumeSpecName: "kube-api-access-kbp6b") pod "a719243c-35c2-4ecf-89ff-7843179f36de" (UID: "a719243c-35c2-4ecf-89ff-7843179f36de"). InnerVolumeSpecName "kube-api-access-kbp6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:28:49 crc kubenswrapper[4835]: I1003 18:28:49.901891 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a719243c-35c2-4ecf-89ff-7843179f36de-util" (OuterVolumeSpecName: "util") pod "a719243c-35c2-4ecf-89ff-7843179f36de" (UID: "a719243c-35c2-4ecf-89ff-7843179f36de"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:28:49 crc kubenswrapper[4835]: I1003 18:28:49.988002 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbp6b\" (UniqueName: \"kubernetes.io/projected/a719243c-35c2-4ecf-89ff-7843179f36de-kube-api-access-kbp6b\") on node \"crc\" DevicePath \"\"" Oct 03 18:28:49 crc kubenswrapper[4835]: I1003 18:28:49.988042 4835 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a719243c-35c2-4ecf-89ff-7843179f36de-util\") on node \"crc\" DevicePath \"\"" Oct 03 18:28:49 crc kubenswrapper[4835]: I1003 18:28:49.988051 4835 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a719243c-35c2-4ecf-89ff-7843179f36de-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:28:50 crc kubenswrapper[4835]: I1003 18:28:50.530609 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" event={"ID":"a719243c-35c2-4ecf-89ff-7843179f36de","Type":"ContainerDied","Data":"ad2d05c7795936abdb9afcf111d44a49955b0c2346dbfbb3412666ab64f86309"} Oct 03 18:28:50 crc kubenswrapper[4835]: I1003 18:28:50.530896 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad2d05c7795936abdb9afcf111d44a49955b0c2346dbfbb3412666ab64f86309" Oct 03 18:28:50 crc kubenswrapper[4835]: I1003 18:28:50.530658 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt" Oct 03 18:28:56 crc kubenswrapper[4835]: I1003 18:28:56.092554 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6479c8db94-bn59w"] Oct 03 18:28:56 crc kubenswrapper[4835]: E1003 18:28:56.093108 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a719243c-35c2-4ecf-89ff-7843179f36de" containerName="pull" Oct 03 18:28:56 crc kubenswrapper[4835]: I1003 18:28:56.093120 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a719243c-35c2-4ecf-89ff-7843179f36de" containerName="pull" Oct 03 18:28:56 crc kubenswrapper[4835]: E1003 18:28:56.093132 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a719243c-35c2-4ecf-89ff-7843179f36de" containerName="extract" Oct 03 18:28:56 crc kubenswrapper[4835]: I1003 18:28:56.093137 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a719243c-35c2-4ecf-89ff-7843179f36de" containerName="extract" Oct 03 18:28:56 crc kubenswrapper[4835]: E1003 18:28:56.093147 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a719243c-35c2-4ecf-89ff-7843179f36de" containerName="util" Oct 03 18:28:56 crc kubenswrapper[4835]: I1003 18:28:56.093154 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a719243c-35c2-4ecf-89ff-7843179f36de" containerName="util" Oct 03 18:28:56 crc kubenswrapper[4835]: I1003 18:28:56.093292 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a719243c-35c2-4ecf-89ff-7843179f36de" containerName="extract" Oct 03 18:28:56 crc kubenswrapper[4835]: I1003 18:28:56.093871 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6479c8db94-bn59w" Oct 03 18:28:56 crc kubenswrapper[4835]: I1003 18:28:56.097018 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-gmrkf" Oct 03 18:28:56 crc kubenswrapper[4835]: I1003 18:28:56.140552 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6479c8db94-bn59w"] Oct 03 18:28:56 crc kubenswrapper[4835]: I1003 18:28:56.164114 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhshw\" (UniqueName: \"kubernetes.io/projected/0369532e-6ba2-4da2-9e1a-c8870d14f001-kube-api-access-xhshw\") pod \"openstack-operator-controller-operator-6479c8db94-bn59w\" (UID: \"0369532e-6ba2-4da2-9e1a-c8870d14f001\") " pod="openstack-operators/openstack-operator-controller-operator-6479c8db94-bn59w" Oct 03 18:28:56 crc kubenswrapper[4835]: I1003 18:28:56.265368 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhshw\" (UniqueName: \"kubernetes.io/projected/0369532e-6ba2-4da2-9e1a-c8870d14f001-kube-api-access-xhshw\") pod \"openstack-operator-controller-operator-6479c8db94-bn59w\" (UID: \"0369532e-6ba2-4da2-9e1a-c8870d14f001\") " pod="openstack-operators/openstack-operator-controller-operator-6479c8db94-bn59w" Oct 03 18:28:56 crc kubenswrapper[4835]: I1003 18:28:56.292141 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhshw\" (UniqueName: \"kubernetes.io/projected/0369532e-6ba2-4da2-9e1a-c8870d14f001-kube-api-access-xhshw\") pod \"openstack-operator-controller-operator-6479c8db94-bn59w\" (UID: \"0369532e-6ba2-4da2-9e1a-c8870d14f001\") " pod="openstack-operators/openstack-operator-controller-operator-6479c8db94-bn59w" Oct 03 18:28:56 crc kubenswrapper[4835]: I1003 18:28:56.408300 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6479c8db94-bn59w" Oct 03 18:28:56 crc kubenswrapper[4835]: I1003 18:28:56.840958 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6479c8db94-bn59w"] Oct 03 18:28:56 crc kubenswrapper[4835]: W1003 18:28:56.850206 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0369532e_6ba2_4da2_9e1a_c8870d14f001.slice/crio-d9c2b6f199b83fef3c0afe2f56189a6bceead8039af50b14bfd53834e79b9aa7 WatchSource:0}: Error finding container d9c2b6f199b83fef3c0afe2f56189a6bceead8039af50b14bfd53834e79b9aa7: Status 404 returned error can't find the container with id d9c2b6f199b83fef3c0afe2f56189a6bceead8039af50b14bfd53834e79b9aa7 Oct 03 18:28:57 crc kubenswrapper[4835]: I1003 18:28:57.570740 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6479c8db94-bn59w" event={"ID":"0369532e-6ba2-4da2-9e1a-c8870d14f001","Type":"ContainerStarted","Data":"d9c2b6f199b83fef3c0afe2f56189a6bceead8039af50b14bfd53834e79b9aa7"} Oct 03 18:29:00 crc kubenswrapper[4835]: I1003 18:29:00.588280 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6479c8db94-bn59w" event={"ID":"0369532e-6ba2-4da2-9e1a-c8870d14f001","Type":"ContainerStarted","Data":"cea27614ba26f676d755b544e4c5e16013ad39d2204543ae1588fa94fb89edf7"} Oct 03 18:29:03 crc kubenswrapper[4835]: I1003 18:29:03.609508 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6479c8db94-bn59w" event={"ID":"0369532e-6ba2-4da2-9e1a-c8870d14f001","Type":"ContainerStarted","Data":"3d6329cd781e1532bd9db3c6a643f0e02f91f676af7d49cce0a662cd7fea09f2"} Oct 03 18:29:03 crc kubenswrapper[4835]: I1003 18:29:03.610046 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6479c8db94-bn59w" Oct 03 18:29:03 crc kubenswrapper[4835]: I1003 18:29:03.641377 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6479c8db94-bn59w" podStartSLOduration=2.000033594 podStartE2EDuration="7.641362533s" podCreationTimestamp="2025-10-03 18:28:56 +0000 UTC" firstStartedPulling="2025-10-03 18:28:56.852084728 +0000 UTC m=+878.568025600" lastFinishedPulling="2025-10-03 18:29:02.493413667 +0000 UTC m=+884.209354539" observedRunningTime="2025-10-03 18:29:03.637622041 +0000 UTC m=+885.353562923" watchObservedRunningTime="2025-10-03 18:29:03.641362533 +0000 UTC m=+885.357303395" Oct 03 18:29:05 crc kubenswrapper[4835]: I1003 18:29:05.358901 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:29:05 crc kubenswrapper[4835]: I1003 18:29:05.359819 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:29:06 crc 
kubenswrapper[4835]: I1003 18:29:06.410515 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6479c8db94-bn59w" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.607862 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-8b5bp"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.609629 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-8b5bp" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.611030 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-lwvmq" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.616275 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-h6txd"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.618138 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-h6txd" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.619921 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-l52gd" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.636518 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-2w59k"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.637450 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2w59k" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.638909 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-9pdd6" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.652019 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-5j994"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.653033 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-5j994" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.658815 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-2w59k"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.668387 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-xs9pp" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.672723 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-5j994"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.677186 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-lpmkz"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.678215 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-599898f689-lpmkz" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.680929 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-cxm4r" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.683314 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48g7l\" (UniqueName: \"kubernetes.io/projected/52d602a7-7c52-410b-b0d3-7a2233258474-kube-api-access-48g7l\") pod \"cinder-operator-controller-manager-79d68d6c85-h6txd\" (UID: \"52d602a7-7c52-410b-b0d3-7a2233258474\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-h6txd" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.683368 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm2q9\" (UniqueName: \"kubernetes.io/projected/0d92cdf1-cce4-485e-9ed5-2539600d7e36-kube-api-access-nm2q9\") pod \"barbican-operator-controller-manager-6c675fb79f-8b5bp\" (UID: \"0d92cdf1-cce4-485e-9ed5-2539600d7e36\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-8b5bp" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.683443 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqbdz\" (UniqueName: \"kubernetes.io/projected/9a423f22-efb7-4413-b0ba-886fb392aa5c-kube-api-access-mqbdz\") pod \"heat-operator-controller-manager-599898f689-lpmkz\" (UID: \"9a423f22-efb7-4413-b0ba-886fb392aa5c\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-lpmkz" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.683476 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dkzq\" (UniqueName: \"kubernetes.io/projected/f7af64e9-1970-4672-8564-ba96ab371353-kube-api-access-2dkzq\") pod \"designate-operator-controller-manager-75dfd9b554-2w59k\" (UID: \"f7af64e9-1970-4672-8564-ba96ab371353\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2w59k" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.683504 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pktzq\" (UniqueName: \"kubernetes.io/projected/6b594ae9-45e5-4ae6-b59e-8e1e44a182db-kube-api-access-pktzq\") pod \"glance-operator-controller-manager-846dff85b5-5j994\" (UID: \"6b594ae9-45e5-4ae6-b59e-8e1e44a182db\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-5j994" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.685520 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-8b5bp"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.699135 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-dnbkj"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.700488 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-dnbkj" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.714935 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-lpmkz"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.715026 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-wnv5q" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.730545 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-h6txd"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.742228 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-dnbkj"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.749203 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.750434 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.751670 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-2jk85" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.753968 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.767537 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.769549 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-9x4rx"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.770599 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-9x4rx" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.772654 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-t6s82" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.784511 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pktzq\" (UniqueName: \"kubernetes.io/projected/6b594ae9-45e5-4ae6-b59e-8e1e44a182db-kube-api-access-pktzq\") pod \"glance-operator-controller-manager-846dff85b5-5j994\" (UID: \"6b594ae9-45e5-4ae6-b59e-8e1e44a182db\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-5j994" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.784596 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48g7l\" (UniqueName: \"kubernetes.io/projected/52d602a7-7c52-410b-b0d3-7a2233258474-kube-api-access-48g7l\") pod \"cinder-operator-controller-manager-79d68d6c85-h6txd\" (UID: \"52d602a7-7c52-410b-b0d3-7a2233258474\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-h6txd" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.784633 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm2q9\" (UniqueName: \"kubernetes.io/projected/0d92cdf1-cce4-485e-9ed5-2539600d7e36-kube-api-access-nm2q9\") pod \"barbican-operator-controller-manager-6c675fb79f-8b5bp\" (UID: \"0d92cdf1-cce4-485e-9ed5-2539600d7e36\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-8b5bp" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.784717 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqbdz\" (UniqueName: \"kubernetes.io/projected/9a423f22-efb7-4413-b0ba-886fb392aa5c-kube-api-access-mqbdz\") pod \"heat-operator-controller-manager-599898f689-lpmkz\" (UID: \"9a423f22-efb7-4413-b0ba-886fb392aa5c\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-lpmkz" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.784752 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dkzq\" (UniqueName: \"kubernetes.io/projected/f7af64e9-1970-4672-8564-ba96ab371353-kube-api-access-2dkzq\") pod \"designate-operator-controller-manager-75dfd9b554-2w59k\" (UID: \"f7af64e9-1970-4672-8564-ba96ab371353\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2w59k" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.785224 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-6dbbn"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.816403 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-6dbbn" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.831557 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-t4xs2" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.838598 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pktzq\" (UniqueName: \"kubernetes.io/projected/6b594ae9-45e5-4ae6-b59e-8e1e44a182db-kube-api-access-pktzq\") pod \"glance-operator-controller-manager-846dff85b5-5j994\" (UID: \"6b594ae9-45e5-4ae6-b59e-8e1e44a182db\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-5j994" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.840026 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dkzq\" (UniqueName: \"kubernetes.io/projected/f7af64e9-1970-4672-8564-ba96ab371353-kube-api-access-2dkzq\") pod \"designate-operator-controller-manager-75dfd9b554-2w59k\" (UID: \"f7af64e9-1970-4672-8564-ba96ab371353\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2w59k" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.847500 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqbdz\" (UniqueName: \"kubernetes.io/projected/9a423f22-efb7-4413-b0ba-886fb392aa5c-kube-api-access-mqbdz\") pod \"heat-operator-controller-manager-599898f689-lpmkz\" (UID: \"9a423f22-efb7-4413-b0ba-886fb392aa5c\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-lpmkz" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.880966 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-9x4rx"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.884710 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm2q9\" (UniqueName: \"kubernetes.io/projected/0d92cdf1-cce4-485e-9ed5-2539600d7e36-kube-api-access-nm2q9\") pod \"barbican-operator-controller-manager-6c675fb79f-8b5bp\" (UID: \"0d92cdf1-cce4-485e-9ed5-2539600d7e36\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-8b5bp" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.890561 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l4xp\" (UniqueName: \"kubernetes.io/projected/75049f02-3a34-4376-b2b2-1c447894b16c-kube-api-access-7l4xp\") pod \"horizon-operator-controller-manager-6769b867d9-dnbkj\" (UID: \"75049f02-3a34-4376-b2b2-1c447894b16c\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-dnbkj" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.890652 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgtsw\" (UniqueName: \"kubernetes.io/projected/050d6159-92eb-4f65-8c33-dce9d2cac262-kube-api-access-rgtsw\") pod \"infra-operator-controller-manager-5fbf469cd7-5hv6c\" (UID: \"050d6159-92eb-4f65-8c33-dce9d2cac262\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.890712 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jv6r\" (UniqueName: 
\"kubernetes.io/projected/967b019a-36f2-4dda-8d6f-968cfb65f954-kube-api-access-4jv6r\") pod \"ironic-operator-controller-manager-84bc9db6cc-9x4rx\" (UID: \"967b019a-36f2-4dda-8d6f-968cfb65f954\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-9x4rx" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.890728 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/050d6159-92eb-4f65-8c33-dce9d2cac262-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-5hv6c\" (UID: \"050d6159-92eb-4f65-8c33-dce9d2cac262\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.915725 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48g7l\" (UniqueName: \"kubernetes.io/projected/52d602a7-7c52-410b-b0d3-7a2233258474-kube-api-access-48g7l\") pod \"cinder-operator-controller-manager-79d68d6c85-h6txd\" (UID: \"52d602a7-7c52-410b-b0d3-7a2233258474\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-h6txd" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.928567 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-bxwnx"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.931138 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-bxwnx" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.939353 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-8b5bp" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.941017 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7wfvz" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.946607 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-g5nh8"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.960574 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-g5nh8" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.961874 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-h6txd" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.962309 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2w59k" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.963924 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-cmtp6" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.972820 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-bxwnx"] Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.979151 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-5j994" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.991622 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4dzb\" (UniqueName: \"kubernetes.io/projected/3d57a2da-d32b-4fea-8044-da2ab34d87d5-kube-api-access-v4dzb\") pod \"keystone-operator-controller-manager-7f55849f88-6dbbn\" (UID: \"3d57a2da-d32b-4fea-8044-da2ab34d87d5\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-6dbbn" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.991782 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/050d6159-92eb-4f65-8c33-dce9d2cac262-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-5hv6c\" (UID: \"050d6159-92eb-4f65-8c33-dce9d2cac262\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.992089 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jv6r\" (UniqueName: \"kubernetes.io/projected/967b019a-36f2-4dda-8d6f-968cfb65f954-kube-api-access-4jv6r\") pod \"ironic-operator-controller-manager-84bc9db6cc-9x4rx\" (UID: \"967b019a-36f2-4dda-8d6f-968cfb65f954\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-9x4rx" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.992241 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l4xp\" (UniqueName: \"kubernetes.io/projected/75049f02-3a34-4376-b2b2-1c447894b16c-kube-api-access-7l4xp\") pod \"horizon-operator-controller-manager-6769b867d9-dnbkj\" (UID: \"75049f02-3a34-4376-b2b2-1c447894b16c\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-dnbkj" Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.992419 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgtsw\" (UniqueName: \"kubernetes.io/projected/050d6159-92eb-4f65-8c33-dce9d2cac262-kube-api-access-rgtsw\") pod \"infra-operator-controller-manager-5fbf469cd7-5hv6c\" (UID: \"050d6159-92eb-4f65-8c33-dce9d2cac262\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" Oct 03 18:29:22 crc kubenswrapper[4835]: E1003 18:29:22.992253 4835 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 03 18:29:22 crc kubenswrapper[4835]: E1003 18:29:22.992582 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/050d6159-92eb-4f65-8c33-dce9d2cac262-cert podName:050d6159-92eb-4f65-8c33-dce9d2cac262 nodeName:}" failed. No retries permitted until 2025-10-03 18:29:23.492556813 +0000 UTC m=+905.208497685 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/050d6159-92eb-4f65-8c33-dce9d2cac262-cert") pod "infra-operator-controller-manager-5fbf469cd7-5hv6c" (UID: "050d6159-92eb-4f65-8c33-dce9d2cac262") : secret "infra-operator-webhook-server-cert" not found Oct 03 18:29:22 crc kubenswrapper[4835]: I1003 18:29:22.992711 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrvdm\" (UniqueName: \"kubernetes.io/projected/c248086e-e0fe-47f0-972e-8378189c018f-kube-api-access-hrvdm\") pod \"manila-operator-controller-manager-6fd6854b49-bxwnx\" (UID: \"c248086e-e0fe-47f0-972e-8378189c018f\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-bxwnx" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.000456 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-599898f689-lpmkz" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.002811 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-g5nh8"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.013375 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-6dbbn"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.019597 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-rrqt8"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.022767 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-rrqt8" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.024406 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jv6r\" (UniqueName: \"kubernetes.io/projected/967b019a-36f2-4dda-8d6f-968cfb65f954-kube-api-access-4jv6r\") pod \"ironic-operator-controller-manager-84bc9db6cc-9x4rx\" (UID: \"967b019a-36f2-4dda-8d6f-968cfb65f954\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-9x4rx" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.024405 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-xrb5b"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.028392 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-tbmw9" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.028914 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgtsw\" (UniqueName: \"kubernetes.io/projected/050d6159-92eb-4f65-8c33-dce9d2cac262-kube-api-access-rgtsw\") pod \"infra-operator-controller-manager-5fbf469cd7-5hv6c\" (UID: \"050d6159-92eb-4f65-8c33-dce9d2cac262\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.029789 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-xrb5b" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.031271 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-dq9x8" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.038187 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-rrqt8"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.038942 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l4xp\" (UniqueName: \"kubernetes.io/projected/75049f02-3a34-4376-b2b2-1c447894b16c-kube-api-access-7l4xp\") pod \"horizon-operator-controller-manager-6769b867d9-dnbkj\" (UID: \"75049f02-3a34-4376-b2b2-1c447894b16c\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-dnbkj" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.045303 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-dmgs4"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.046461 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-dmgs4" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.048780 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-xrb5b"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.049265 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-jggnj" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.066142 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-dmgs4"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.094388 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4dzb\" (UniqueName: \"kubernetes.io/projected/3d57a2da-d32b-4fea-8044-da2ab34d87d5-kube-api-access-v4dzb\") pod \"keystone-operator-controller-manager-7f55849f88-6dbbn\" (UID: \"3d57a2da-d32b-4fea-8044-da2ab34d87d5\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-6dbbn" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.094431 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mck2\" (UniqueName: \"kubernetes.io/projected/822f37e4-b1f2-40ed-a075-cdff171e42bd-kube-api-access-9mck2\") pod \"mariadb-operator-controller-manager-5c468bf4d4-g5nh8\" (UID: \"822f37e4-b1f2-40ed-a075-cdff171e42bd\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-g5nh8" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.094486 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf968\" (UniqueName: \"kubernetes.io/projected/54c68b19-b66f-47af-808e-e3708ee36642-kube-api-access-mf968\") pod \"octavia-operator-controller-manager-59d6cfdf45-dmgs4\" (UID: \"54c68b19-b66f-47af-808e-e3708ee36642\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-dmgs4" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.094519 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cngnj\" (UniqueName: \"kubernetes.io/projected/f2b0de06-963e-4fd4-8056-3272f20f3ef8-kube-api-access-cngnj\") pod \"neutron-operator-controller-manager-6574bf987d-rrqt8\" (UID: \"f2b0de06-963e-4fd4-8056-3272f20f3ef8\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-rrqt8" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.094557 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6thvx\" (UniqueName: \"kubernetes.io/projected/85ad7a18-4a2c-4225-972b-5e12be17aee0-kube-api-access-6thvx\") pod \"nova-operator-controller-manager-555c7456bd-xrb5b\" (UID: \"85ad7a18-4a2c-4225-972b-5e12be17aee0\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-xrb5b" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.094598 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrvdm\" (UniqueName: \"kubernetes.io/projected/c248086e-e0fe-47f0-972e-8378189c018f-kube-api-access-hrvdm\") pod \"manila-operator-controller-manager-6fd6854b49-bxwnx\" (UID: \"c248086e-e0fe-47f0-972e-8378189c018f\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-bxwnx" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.095482 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.096510 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.097784 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.099746 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-8kst4" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.113629 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4dzb\" (UniqueName: \"kubernetes.io/projected/3d57a2da-d32b-4fea-8044-da2ab34d87d5-kube-api-access-v4dzb\") pod \"keystone-operator-controller-manager-7f55849f88-6dbbn\" (UID: \"3d57a2da-d32b-4fea-8044-da2ab34d87d5\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-6dbbn" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.116548 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrvdm\" (UniqueName: \"kubernetes.io/projected/c248086e-e0fe-47f0-972e-8378189c018f-kube-api-access-hrvdm\") pod \"manila-operator-controller-manager-6fd6854b49-bxwnx\" (UID: \"c248086e-e0fe-47f0-972e-8378189c018f\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-bxwnx" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.117370 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-6t4pn"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.118809 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-6t4pn" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.121133 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-wmd2w" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.124339 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-9x4rx" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.136175 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.138082 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.143354 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-6t4pn"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.147336 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-988wr" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.157114 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.172826 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.193841 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.196189 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf968\" (UniqueName: \"kubernetes.io/projected/54c68b19-b66f-47af-808e-e3708ee36642-kube-api-access-mf968\") pod \"octavia-operator-controller-manager-59d6cfdf45-dmgs4\" (UID: \"54c68b19-b66f-47af-808e-e3708ee36642\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-dmgs4" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.196247 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cngnj\" (UniqueName: \"kubernetes.io/projected/f2b0de06-963e-4fd4-8056-3272f20f3ef8-kube-api-access-cngnj\") pod \"neutron-operator-controller-manager-6574bf987d-rrqt8\" (UID: \"f2b0de06-963e-4fd4-8056-3272f20f3ef8\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-rrqt8" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.196294 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e96c6565-8432-4b47-bc1a-f7510415a0dd-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t\" (UID: \"e96c6565-8432-4b47-bc1a-f7510415a0dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.196317 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6thvx\" (UniqueName: 
\"kubernetes.io/projected/85ad7a18-4a2c-4225-972b-5e12be17aee0-kube-api-access-6thvx\") pod \"nova-operator-controller-manager-555c7456bd-xrb5b\" (UID: \"85ad7a18-4a2c-4225-972b-5e12be17aee0\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-xrb5b" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.196355 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nsw7\" (UniqueName: \"kubernetes.io/projected/11585794-4db5-4e34-aeaa-24036489269b-kube-api-access-9nsw7\") pod \"placement-operator-controller-manager-7d8bb7f44c-xdgkx\" (UID: \"11585794-4db5-4e34-aeaa-24036489269b\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.196402 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mck2\" (UniqueName: \"kubernetes.io/projected/822f37e4-b1f2-40ed-a075-cdff171e42bd-kube-api-access-9mck2\") pod \"mariadb-operator-controller-manager-5c468bf4d4-g5nh8\" (UID: \"822f37e4-b1f2-40ed-a075-cdff171e42bd\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-g5nh8" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.196421 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42ldp\" (UniqueName: \"kubernetes.io/projected/3c7d1a89-7769-4031-abac-edd6b93bdd30-kube-api-access-42ldp\") pod \"ovn-operator-controller-manager-688db7b6c7-6t4pn\" (UID: \"3c7d1a89-7769-4031-abac-edd6b93bdd30\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-6t4pn" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.196442 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psjgb\" (UniqueName: \"kubernetes.io/projected/e96c6565-8432-4b47-bc1a-f7510415a0dd-kube-api-access-psjgb\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t\" (UID: \"e96c6565-8432-4b47-bc1a-f7510415a0dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.196666 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.198469 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-577q7" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.216347 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.219959 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf968\" (UniqueName: \"kubernetes.io/projected/54c68b19-b66f-47af-808e-e3708ee36642-kube-api-access-mf968\") pod \"octavia-operator-controller-manager-59d6cfdf45-dmgs4\" (UID: \"54c68b19-b66f-47af-808e-e3708ee36642\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-dmgs4" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.226276 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.227497 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.232682 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-mxlq9" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.236879 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mck2\" (UniqueName: \"kubernetes.io/projected/822f37e4-b1f2-40ed-a075-cdff171e42bd-kube-api-access-9mck2\") pod \"mariadb-operator-controller-manager-5c468bf4d4-g5nh8\" (UID: \"822f37e4-b1f2-40ed-a075-cdff171e42bd\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-g5nh8" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.239373 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.250923 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6thvx\" (UniqueName: \"kubernetes.io/projected/85ad7a18-4a2c-4225-972b-5e12be17aee0-kube-api-access-6thvx\") pod \"nova-operator-controller-manager-555c7456bd-xrb5b\" (UID: \"85ad7a18-4a2c-4225-972b-5e12be17aee0\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-xrb5b" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.257456 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.258200 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cngnj\" (UniqueName: \"kubernetes.io/projected/f2b0de06-963e-4fd4-8056-3272f20f3ef8-kube-api-access-cngnj\") pod \"neutron-operator-controller-manager-6574bf987d-rrqt8\" (UID: \"f2b0de06-963e-4fd4-8056-3272f20f3ef8\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-rrqt8" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.261205 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.262578 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-6dbbn" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.266011 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-8f4dc" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.284033 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.298563 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2r9x\" (UniqueName: \"kubernetes.io/projected/cebb3329-2036-4540-9444-d47294d13aff-kube-api-access-f2r9x\") pod \"swift-operator-controller-manager-6859f9b676-fljf9\" (UID: \"cebb3329-2036-4540-9444-d47294d13aff\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.298915 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42ldp\" (UniqueName: \"kubernetes.io/projected/3c7d1a89-7769-4031-abac-edd6b93bdd30-kube-api-access-42ldp\") pod \"ovn-operator-controller-manager-688db7b6c7-6t4pn\" (UID: \"3c7d1a89-7769-4031-abac-edd6b93bdd30\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-6t4pn" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.298948 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psjgb\" (UniqueName: \"kubernetes.io/projected/e96c6565-8432-4b47-bc1a-f7510415a0dd-kube-api-access-psjgb\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t\" (UID: \"e96c6565-8432-4b47-bc1a-f7510415a0dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.298971 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntl8h\" (UniqueName: \"kubernetes.io/projected/ad1db9d5-da6c-4498-8bb4-176c87481a4d-kube-api-access-ntl8h\") pod \"telemetry-operator-controller-manager-5db5cf686f-bzmxq\" (UID: \"ad1db9d5-da6c-4498-8bb4-176c87481a4d\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.299061 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e96c6565-8432-4b47-bc1a-f7510415a0dd-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t\" (UID: \"e96c6565-8432-4b47-bc1a-f7510415a0dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.299120 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nsw7\" (UniqueName: \"kubernetes.io/projected/11585794-4db5-4e34-aeaa-24036489269b-kube-api-access-9nsw7\") pod \"placement-operator-controller-manager-7d8bb7f44c-xdgkx\" (UID: \"11585794-4db5-4e34-aeaa-24036489269b\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx" Oct 03 18:29:23 crc 
kubenswrapper[4835]: I1003 18:29:23.315964 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nsw7\" (UniqueName: \"kubernetes.io/projected/11585794-4db5-4e34-aeaa-24036489269b-kube-api-access-9nsw7\") pod \"placement-operator-controller-manager-7d8bb7f44c-xdgkx\" (UID: \"11585794-4db5-4e34-aeaa-24036489269b\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx" Oct 03 18:29:23 crc kubenswrapper[4835]: E1003 18:29:23.316269 4835 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 18:29:23 crc kubenswrapper[4835]: E1003 18:29:23.316370 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e96c6565-8432-4b47-bc1a-f7510415a0dd-cert podName:e96c6565-8432-4b47-bc1a-f7510415a0dd nodeName:}" failed. No retries permitted until 2025-10-03 18:29:23.816331365 +0000 UTC m=+905.532272237 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e96c6565-8432-4b47-bc1a-f7510415a0dd-cert") pod "openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" (UID: "e96c6565-8432-4b47-bc1a-f7510415a0dd") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.316850 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-bxwnx" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.319019 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-dnbkj" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.320665 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42ldp\" (UniqueName: \"kubernetes.io/projected/3c7d1a89-7769-4031-abac-edd6b93bdd30-kube-api-access-42ldp\") pod \"ovn-operator-controller-manager-688db7b6c7-6t4pn\" (UID: \"3c7d1a89-7769-4031-abac-edd6b93bdd30\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-6t4pn" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.325762 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-g5nh8" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.351513 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psjgb\" (UniqueName: \"kubernetes.io/projected/e96c6565-8432-4b47-bc1a-f7510415a0dd-kube-api-access-psjgb\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t\" (UID: \"e96c6565-8432-4b47-bc1a-f7510415a0dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.377683 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-rrqt8" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.395106 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7d5d9b469c-dh8bv"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.399914 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2r9x\" (UniqueName: \"kubernetes.io/projected/cebb3329-2036-4540-9444-d47294d13aff-kube-api-access-f2r9x\") pod \"swift-operator-controller-manager-6859f9b676-fljf9\" (UID: \"cebb3329-2036-4540-9444-d47294d13aff\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.399957 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpt7h\" (UniqueName: \"kubernetes.io/projected/461cd79d-16ba-4619-b2f0-1e4611a092e1-kube-api-access-lpt7h\") pod \"test-operator-controller-manager-5cd5cb47d7-6g488\" (UID: \"461cd79d-16ba-4619-b2f0-1e4611a092e1\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.399983 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntl8h\" (UniqueName: \"kubernetes.io/projected/ad1db9d5-da6c-4498-8bb4-176c87481a4d-kube-api-access-ntl8h\") pod \"telemetry-operator-controller-manager-5db5cf686f-bzmxq\" (UID: \"ad1db9d5-da6c-4498-8bb4-176c87481a4d\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.401343 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-xrb5b" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.403960 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7d5d9b469c-dh8bv" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.410830 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-xfswj" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.419419 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntl8h\" (UniqueName: \"kubernetes.io/projected/ad1db9d5-da6c-4498-8bb4-176c87481a4d-kube-api-access-ntl8h\") pod \"telemetry-operator-controller-manager-5db5cf686f-bzmxq\" (UID: \"ad1db9d5-da6c-4498-8bb4-176c87481a4d\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.419528 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7d5d9b469c-dh8bv"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.423003 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2r9x\" (UniqueName: \"kubernetes.io/projected/cebb3329-2036-4540-9444-d47294d13aff-kube-api-access-f2r9x\") pod \"swift-operator-controller-manager-6859f9b676-fljf9\" (UID: \"cebb3329-2036-4540-9444-d47294d13aff\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.443770 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.445022 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.449790 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-m8wqx" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.449922 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.454874 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-dmgs4" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.459339 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.487741 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.488915 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.492397 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-ffmgb" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.501409 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpt7h\" (UniqueName: \"kubernetes.io/projected/461cd79d-16ba-4619-b2f0-1e4611a092e1-kube-api-access-lpt7h\") pod \"test-operator-controller-manager-5cd5cb47d7-6g488\" (UID: \"461cd79d-16ba-4619-b2f0-1e4611a092e1\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.501458 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v94tn\" (UniqueName: \"kubernetes.io/projected/c997309f-800d-4180-aaa9-2594faef74ee-kube-api-access-v94tn\") pod \"openstack-operator-controller-manager-5b9c56875d-f9xpd\" (UID: \"c997309f-800d-4180-aaa9-2594faef74ee\") " pod="openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.501480 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c997309f-800d-4180-aaa9-2594faef74ee-cert\") pod \"openstack-operator-controller-manager-5b9c56875d-f9xpd\" (UID: \"c997309f-800d-4180-aaa9-2594faef74ee\") " pod="openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.501509 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/050d6159-92eb-4f65-8c33-dce9d2cac262-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-5hv6c\" (UID: \"050d6159-92eb-4f65-8c33-dce9d2cac262\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.501589 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzskb\" (UniqueName: \"kubernetes.io/projected/01389591-f485-4a18-a393-8f5d654ba5e7-kube-api-access-wzskb\") pod \"watcher-operator-controller-manager-7d5d9b469c-dh8bv\" (UID: \"01389591-f485-4a18-a393-8f5d654ba5e7\") " pod="openstack-operators/watcher-operator-controller-manager-7d5d9b469c-dh8bv" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.506457 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/050d6159-92eb-4f65-8c33-dce9d2cac262-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-5hv6c\" (UID: \"050d6159-92eb-4f65-8c33-dce9d2cac262\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.507961 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.516099 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-6t4pn" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.522613 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.540018 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpt7h\" (UniqueName: \"kubernetes.io/projected/461cd79d-16ba-4619-b2f0-1e4611a092e1-kube-api-access-lpt7h\") pod \"test-operator-controller-manager-5cd5cb47d7-6g488\" (UID: \"461cd79d-16ba-4619-b2f0-1e4611a092e1\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.563205 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.573480 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.597665 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.603252 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v94tn\" (UniqueName: \"kubernetes.io/projected/c997309f-800d-4180-aaa9-2594faef74ee-kube-api-access-v94tn\") pod \"openstack-operator-controller-manager-5b9c56875d-f9xpd\" (UID: \"c997309f-800d-4180-aaa9-2594faef74ee\") " pod="openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.603295 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c997309f-800d-4180-aaa9-2594faef74ee-cert\") pod \"openstack-operator-controller-manager-5b9c56875d-f9xpd\" (UID: \"c997309f-800d-4180-aaa9-2594faef74ee\") " pod="openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.603342 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrk22\" (UniqueName: \"kubernetes.io/projected/afea5a2a-fdd8-47a5-b09e-44f47b2f3f92-kube-api-access-lrk22\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl\" (UID: \"afea5a2a-fdd8-47a5-b09e-44f47b2f3f92\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.603400 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzskb\" (UniqueName: \"kubernetes.io/projected/01389591-f485-4a18-a393-8f5d654ba5e7-kube-api-access-wzskb\") pod \"watcher-operator-controller-manager-7d5d9b469c-dh8bv\" (UID: \"01389591-f485-4a18-a393-8f5d654ba5e7\") " pod="openstack-operators/watcher-operator-controller-manager-7d5d9b469c-dh8bv" Oct 03 18:29:23 crc kubenswrapper[4835]: E1003 18:29:23.603864 4835 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 03 18:29:23 crc kubenswrapper[4835]: E1003 18:29:23.603905 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c997309f-800d-4180-aaa9-2594faef74ee-cert podName:c997309f-800d-4180-aaa9-2594faef74ee nodeName:}" failed. 
No retries permitted until 2025-10-03 18:29:24.103891408 +0000 UTC m=+905.819832280 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c997309f-800d-4180-aaa9-2594faef74ee-cert") pod "openstack-operator-controller-manager-5b9c56875d-f9xpd" (UID: "c997309f-800d-4180-aaa9-2594faef74ee") : secret "webhook-server-cert" not found Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.639887 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v94tn\" (UniqueName: \"kubernetes.io/projected/c997309f-800d-4180-aaa9-2594faef74ee-kube-api-access-v94tn\") pod \"openstack-operator-controller-manager-5b9c56875d-f9xpd\" (UID: \"c997309f-800d-4180-aaa9-2594faef74ee\") " pod="openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.642650 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzskb\" (UniqueName: \"kubernetes.io/projected/01389591-f485-4a18-a393-8f5d654ba5e7-kube-api-access-wzskb\") pod \"watcher-operator-controller-manager-7d5d9b469c-dh8bv\" (UID: \"01389591-f485-4a18-a393-8f5d654ba5e7\") " pod="openstack-operators/watcher-operator-controller-manager-7d5d9b469c-dh8bv" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.673346 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.706264 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrk22\" (UniqueName: \"kubernetes.io/projected/afea5a2a-fdd8-47a5-b09e-44f47b2f3f92-kube-api-access-lrk22\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl\" (UID: \"afea5a2a-fdd8-47a5-b09e-44f47b2f3f92\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.741832 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-8b5bp"] Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.742427 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7d5d9b469c-dh8bv" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.767533 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrk22\" (UniqueName: \"kubernetes.io/projected/afea5a2a-fdd8-47a5-b09e-44f47b2f3f92-kube-api-access-lrk22\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl\" (UID: \"afea5a2a-fdd8-47a5-b09e-44f47b2f3f92\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.851372 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl" Oct 03 18:29:23 crc kubenswrapper[4835]: I1003 18:29:23.915475 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e96c6565-8432-4b47-bc1a-f7510415a0dd-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t\" (UID: \"e96c6565-8432-4b47-bc1a-f7510415a0dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" Oct 03 18:29:23 crc kubenswrapper[4835]: E1003 18:29:23.915618 4835 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 18:29:23 crc kubenswrapper[4835]: E1003 18:29:23.915669 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e96c6565-8432-4b47-bc1a-f7510415a0dd-cert podName:e96c6565-8432-4b47-bc1a-f7510415a0dd nodeName:}" failed. No retries permitted until 2025-10-03 18:29:24.915653551 +0000 UTC m=+906.631594423 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e96c6565-8432-4b47-bc1a-f7510415a0dd-cert") pod "openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" (UID: "e96c6565-8432-4b47-bc1a-f7510415a0dd") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.031345 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-h6txd"] Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.045621 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-2w59k"] Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.050418 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-lpmkz"] Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.121222 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c997309f-800d-4180-aaa9-2594faef74ee-cert\") pod \"openstack-operator-controller-manager-5b9c56875d-f9xpd\" (UID: \"c997309f-800d-4180-aaa9-2594faef74ee\") " pod="openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd" Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.130868 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c997309f-800d-4180-aaa9-2594faef74ee-cert\") pod \"openstack-operator-controller-manager-5b9c56875d-f9xpd\" (UID: \"c997309f-800d-4180-aaa9-2594faef74ee\") " pod="openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd" Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.223885 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-5j994"] Oct 03 18:29:24 crc kubenswrapper[4835]: W1003 18:29:24.231957 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b594ae9_45e5_4ae6_b59e_8e1e44a182db.slice/crio-aefd1822a053ee611b79c92cb43d6a82b6cc6bbe104c623cc066441ce96b20d4 WatchSource:0}: Error finding container aefd1822a053ee611b79c92cb43d6a82b6cc6bbe104c623cc066441ce96b20d4: Status 404 returned error can't 
find the container with id aefd1822a053ee611b79c92cb43d6a82b6cc6bbe104c623cc066441ce96b20d4 Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.420113 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-9x4rx"] Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.428300 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd" Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.441300 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-dmgs4"] Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.445698 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-rrqt8"] Oct 03 18:29:24 crc kubenswrapper[4835]: W1003 18:29:24.455703 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc248086e_e0fe_47f0_972e_8378189c018f.slice/crio-b9d1ac9f7661248107a1fb248226fe69ca30800e6636c0b7532ff699b2591df6 WatchSource:0}: Error finding container b9d1ac9f7661248107a1fb248226fe69ca30800e6636c0b7532ff699b2591df6: Status 404 returned error can't find the container with id b9d1ac9f7661248107a1fb248226fe69ca30800e6636c0b7532ff699b2591df6 Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.456961 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-bxwnx"] Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.462568 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-6t4pn"] Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.470122 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-xrb5b"] Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.473043 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-dnbkj"] Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.616830 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-6dbbn"] Oct 03 18:29:24 crc kubenswrapper[4835]: W1003 18:29:24.625618 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d57a2da_d32b_4fea_8044_da2ab34d87d5.slice/crio-2919014806d512834d1cd2b8c7d609b147e8c0a9b7b90a2e894899597aa360fd WatchSource:0}: Error finding container 2919014806d512834d1cd2b8c7d609b147e8c0a9b7b90a2e894899597aa360fd: Status 404 returned error can't find the container with id 2919014806d512834d1cd2b8c7d609b147e8c0a9b7b90a2e894899597aa360fd Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.810732 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-xrb5b" event={"ID":"85ad7a18-4a2c-4225-972b-5e12be17aee0","Type":"ContainerStarted","Data":"7dfe262ee0046fb777be4eac5b37075942e4b415d14cadb912bcee51c585f1df"} Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.812082 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-dnbkj" 
event={"ID":"75049f02-3a34-4376-b2b2-1c447894b16c","Type":"ContainerStarted","Data":"29e7dff7e5842f0a5ec1df8047746674dc0a3ad34008f86cbcbe430e88ae42a0"} Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.813261 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2w59k" event={"ID":"f7af64e9-1970-4672-8564-ba96ab371353","Type":"ContainerStarted","Data":"4ee21bfe089be807e742b55f40626d34beab7ffb2f169d13b4e8441c2b7b8d12"} Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.831774 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-6t4pn" event={"ID":"3c7d1a89-7769-4031-abac-edd6b93bdd30","Type":"ContainerStarted","Data":"807e34830ea6cbd157d70f2c81e74c0d9b8f8ed4114b9a80dfecd81a6f8b37b1"} Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.840479 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-dmgs4" event={"ID":"54c68b19-b66f-47af-808e-e3708ee36642","Type":"ContainerStarted","Data":"03c9d694d0d4bc4d021e770d8f7ac2d25d7fba4ebd8cb90eedd101c6a972692f"} Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.842951 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9"] Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.843336 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-lpmkz" event={"ID":"9a423f22-efb7-4413-b0ba-886fb392aa5c","Type":"ContainerStarted","Data":"0ba934703154be5d19d3ec77b3557ca310c15f683700aa3f3530aa690e1619ac"} Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.845598 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-h6txd" event={"ID":"52d602a7-7c52-410b-b0d3-7a2233258474","Type":"ContainerStarted","Data":"ec3ff15be7aca877c36bfe99b1a0b6b12aba52c926da11265c1d1b3ceba7156f"} Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.846465 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-8b5bp" event={"ID":"0d92cdf1-cce4-485e-9ed5-2539600d7e36","Type":"ContainerStarted","Data":"ca4abe1e3420f30f6c54ec17d2e897644b2bdbccd07699f9a77b27cac7cf55ef"} Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.848079 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq"] Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.854941 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488"] Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.857216 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-6dbbn" event={"ID":"3d57a2da-d32b-4fea-8044-da2ab34d87d5","Type":"ContainerStarted","Data":"2919014806d512834d1cd2b8c7d609b147e8c0a9b7b90a2e894899597aa360fd"} Oct 03 18:29:24 crc kubenswrapper[4835]: W1003 18:29:24.857440 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod822f37e4_b1f2_40ed_a075_cdff171e42bd.slice/crio-1de33d6dcae7c3f958f3b584916f65bdb39da6f8cb5b9b49addd4c631baa286d WatchSource:0}: Error finding container 
1de33d6dcae7c3f958f3b584916f65bdb39da6f8cb5b9b49addd4c631baa286d: Status 404 returned error can't find the container with id 1de33d6dcae7c3f958f3b584916f65bdb39da6f8cb5b9b49addd4c631baa286d Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.859223 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-5j994" event={"ID":"6b594ae9-45e5-4ae6-b59e-8e1e44a182db","Type":"ContainerStarted","Data":"aefd1822a053ee611b79c92cb43d6a82b6cc6bbe104c623cc066441ce96b20d4"} Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.859311 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c"] Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.860694 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-9x4rx" event={"ID":"967b019a-36f2-4dda-8d6f-968cfb65f954","Type":"ContainerStarted","Data":"83aa059dd636601043fa5aaf4cadc0e2d673a9c5d4a4dde1cde15daf83390442"} Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.861288 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 18:29:24 crc kubenswrapper[4835]: W1003 18:29:24.861869 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod461cd79d_16ba_4619_b2f0_1e4611a092e1.slice/crio-1e590e0e4a34339077179d4529abad0366f65b7016b8a952aad517b1ab538379 WatchSource:0}: Error finding container 1e590e0e4a34339077179d4529abad0366f65b7016b8a952aad517b1ab538379: Status 404 returned error can't find the container with id 1e590e0e4a34339077179d4529abad0366f65b7016b8a952aad517b1ab538379 Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.862885 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-rrqt8" event={"ID":"f2b0de06-963e-4fd4-8056-3272f20f3ef8","Type":"ContainerStarted","Data":"10ba03d42f31602f30c4e0816ab655c87e282f6ad54d4dd6e10509f0772b56e7"} Oct 03 18:29:24 crc kubenswrapper[4835]: W1003 18:29:24.864014 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad1db9d5_da6c_4498_8bb4_176c87481a4d.slice/crio-4fd65e2580b01198d767048130fb8265d561321f38ed1e4eee15be180d28e7df WatchSource:0}: Error finding container 4fd65e2580b01198d767048130fb8265d561321f38ed1e4eee15be180d28e7df: Status 404 returned error can't find the container with id 4fd65e2580b01198d767048130fb8265d561321f38ed1e4eee15be180d28e7df Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.864050 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-g5nh8"] Oct 03 18:29:24 crc kubenswrapper[4835]: E1003 18:29:24.865468 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lpt7h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd5cb47d7-6g488_openstack-operators(461cd79d-16ba-4619-b2f0-1e4611a092e1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 18:29:24 crc kubenswrapper[4835]: E1003 18:29:24.868586 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ntl8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5db5cf686f-bzmxq_openstack-operators(ad1db9d5-da6c-4498-8bb4-176c87481a4d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.870760 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-bxwnx" event={"ID":"c248086e-e0fe-47f0-972e-8378189c018f","Type":"ContainerStarted","Data":"b9d1ac9f7661248107a1fb248226fe69ca30800e6636c0b7532ff699b2591df6"} Oct 03 18:29:24 crc kubenswrapper[4835]: W1003 18:29:24.871270 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod050d6159_92eb_4f65_8c33_dce9d2cac262.slice/crio-03ad7a73f89a826c9c672fd353828a03ce0516b8d45304574f3fce7800d54a15 WatchSource:0}: Error finding container 03ad7a73f89a826c9c672fd353828a03ce0516b8d45304574f3fce7800d54a15: Status 404 returned error can't find the container with id 03ad7a73f89a826c9c672fd353828a03ce0516b8d45304574f3fce7800d54a15 Oct 03 18:29:24 crc kubenswrapper[4835]: E1003 18:29:24.873496 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f2r9x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6859f9b676-fljf9_openstack-operators(cebb3329-2036-4540-9444-d47294d13aff): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 18:29:24 crc kubenswrapper[4835]: E1003 18:29:24.875043 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rgtsw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-5fbf469cd7-5hv6c_openstack-operators(050d6159-92eb-4f65-8c33-dce9d2cac262): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.939539 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e96c6565-8432-4b47-bc1a-f7510415a0dd-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t\" (UID: \"e96c6565-8432-4b47-bc1a-f7510415a0dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.947285 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7d5d9b469c-dh8bv"] Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.952880 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e96c6565-8432-4b47-bc1a-f7510415a0dd-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t\" (UID: \"e96c6565-8432-4b47-bc1a-f7510415a0dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.963234 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl"] Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.969283 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx"] Oct 03 18:29:24 crc kubenswrapper[4835]: I1003 18:29:24.990271 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.004729 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd"] Oct 03 18:29:25 crc kubenswrapper[4835]: E1003 18:29:25.027059 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lrk22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl_openstack-operators(afea5a2a-fdd8-47a5-b09e-44f47b2f3f92): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 18:29:25 crc kubenswrapper[4835]: E1003 18:29:25.027889 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:725da67b3f9cf2758564e0111928cdd570c0f6f1ca34775f159bbe94deb82548,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9nsw7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-7d8bb7f44c-xdgkx_openstack-operators(11585794-4db5-4e34-aeaa-24036489269b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 18:29:25 crc kubenswrapper[4835]: E1003 18:29:25.029193 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl" podUID="afea5a2a-fdd8-47a5-b09e-44f47b2f3f92" Oct 03 18:29:25 crc kubenswrapper[4835]: E1003 18:29:25.075280 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488" podUID="461cd79d-16ba-4619-b2f0-1e4611a092e1" Oct 03 18:29:25 crc kubenswrapper[4835]: E1003 18:29:25.079211 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9" podUID="cebb3329-2036-4540-9444-d47294d13aff" Oct 03 18:29:25 crc kubenswrapper[4835]: E1003 18:29:25.097390 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq" podUID="ad1db9d5-da6c-4498-8bb4-176c87481a4d" Oct 03 18:29:25 crc kubenswrapper[4835]: E1003 18:29:25.177218 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" podUID="050d6159-92eb-4f65-8c33-dce9d2cac262" Oct 03 18:29:25 crc kubenswrapper[4835]: E1003 18:29:25.271189 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx" podUID="11585794-4db5-4e34-aeaa-24036489269b" Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.533976 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t"] Oct 03 18:29:25 crc kubenswrapper[4835]: W1003 18:29:25.544276 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode96c6565_8432_4b47_bc1a_f7510415a0dd.slice/crio-2f0c22bef7de353664a747413efe94a4e032cbe538a0209b3502ce6f969f6722 WatchSource:0}: Error finding container 2f0c22bef7de353664a747413efe94a4e032cbe538a0209b3502ce6f969f6722: Status 404 returned error can't find the container with id 2f0c22bef7de353664a747413efe94a4e032cbe538a0209b3502ce6f969f6722 Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.895694 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd" event={"ID":"c997309f-800d-4180-aaa9-2594faef74ee","Type":"ContainerStarted","Data":"380d3f96814efc1e883d6c2941e0d551a28e9a177d8cce8377b775c49a49f75e"} Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.896016 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd" event={"ID":"c997309f-800d-4180-aaa9-2594faef74ee","Type":"ContainerStarted","Data":"a814951c83adb74f41d8f8c288606e705dc04d7c9007d312a67331291918a1d4"} Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.896032 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd" event={"ID":"c997309f-800d-4180-aaa9-2594faef74ee","Type":"ContainerStarted","Data":"ed9e1d48c23adbb92a4f55481977520395a01a66be8e21b394c889673727114d"} Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.896147 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd" Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.900260 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-g5nh8" event={"ID":"822f37e4-b1f2-40ed-a075-cdff171e42bd","Type":"ContainerStarted","Data":"1de33d6dcae7c3f958f3b584916f65bdb39da6f8cb5b9b49addd4c631baa286d"} Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.902438 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" event={"ID":"e96c6565-8432-4b47-bc1a-f7510415a0dd","Type":"ContainerStarted","Data":"2f0c22bef7de353664a747413efe94a4e032cbe538a0209b3502ce6f969f6722"} Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.904021 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9" event={"ID":"cebb3329-2036-4540-9444-d47294d13aff","Type":"ContainerStarted","Data":"4c300c8c3ecbd7c82edd3edf2f961f08282874085aa616f6d0d4615af6e246b6"} Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.904048 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9" event={"ID":"cebb3329-2036-4540-9444-d47294d13aff","Type":"ContainerStarted","Data":"8c45d3b31eae6af28e97fd7b74b77ae9be1be6a10e8059dd7742f152e8739645"} Oct 03 
18:29:25 crc kubenswrapper[4835]: E1003 18:29:25.906127 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9" podUID="cebb3329-2036-4540-9444-d47294d13aff" Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.906496 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488" event={"ID":"461cd79d-16ba-4619-b2f0-1e4611a092e1","Type":"ContainerStarted","Data":"0a183641fc562388b6f365e92f62f1f71aed43b97a421e074e8011b84184a251"} Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.906529 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488" event={"ID":"461cd79d-16ba-4619-b2f0-1e4611a092e1","Type":"ContainerStarted","Data":"1e590e0e4a34339077179d4529abad0366f65b7016b8a952aad517b1ab538379"} Oct 03 18:29:25 crc kubenswrapper[4835]: E1003 18:29:25.907571 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488" podUID="461cd79d-16ba-4619-b2f0-1e4611a092e1" Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.908804 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" event={"ID":"050d6159-92eb-4f65-8c33-dce9d2cac262","Type":"ContainerStarted","Data":"f471326ad7156a60ec6512f2eb357d8dfd7074da1a997e03d0769089b3045c7b"} Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.908869 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" event={"ID":"050d6159-92eb-4f65-8c33-dce9d2cac262","Type":"ContainerStarted","Data":"03ad7a73f89a826c9c672fd353828a03ce0516b8d45304574f3fce7800d54a15"} Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.910192 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7d5d9b469c-dh8bv" event={"ID":"01389591-f485-4a18-a393-8f5d654ba5e7","Type":"ContainerStarted","Data":"95d2c09dae2bd8cdec8623638a2be18cde0da1b61d980112372c9d046eb5b50f"} Oct 03 18:29:25 crc kubenswrapper[4835]: E1003 18:29:25.911271 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475\\\"\"" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" podUID="050d6159-92eb-4f65-8c33-dce9d2cac262" Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.915592 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq" event={"ID":"ad1db9d5-da6c-4498-8bb4-176c87481a4d","Type":"ContainerStarted","Data":"21b88c18a5b4be2700913f53ddba143a0fe815ac0d33679e1199c187d661f103"} Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.915633 4835 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq" event={"ID":"ad1db9d5-da6c-4498-8bb4-176c87481a4d","Type":"ContainerStarted","Data":"4fd65e2580b01198d767048130fb8265d561321f38ed1e4eee15be180d28e7df"} Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.918279 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl" event={"ID":"afea5a2a-fdd8-47a5-b09e-44f47b2f3f92","Type":"ContainerStarted","Data":"92dfb4022abf088f14ea55b00b762bf84f42a48b5ff371e7928c39acb904ca53"} Oct 03 18:29:25 crc kubenswrapper[4835]: E1003 18:29:25.920878 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl" podUID="afea5a2a-fdd8-47a5-b09e-44f47b2f3f92" Oct 03 18:29:25 crc kubenswrapper[4835]: E1003 18:29:25.920879 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq" podUID="ad1db9d5-da6c-4498-8bb4-176c87481a4d" Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.923317 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx" event={"ID":"11585794-4db5-4e34-aeaa-24036489269b","Type":"ContainerStarted","Data":"306815c7aa8c28eb4f68a6613dcdbb477879360dd3e798ecc92caaa01a2f275e"} Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.923359 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx" event={"ID":"11585794-4db5-4e34-aeaa-24036489269b","Type":"ContainerStarted","Data":"9a0cf5d710123d73688a1e672928ab818da34d9833b279e91b9bf333ee03d7e4"} Oct 03 18:29:25 crc kubenswrapper[4835]: E1003 18:29:25.930311 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:725da67b3f9cf2758564e0111928cdd570c0f6f1ca34775f159bbe94deb82548\\\"\"" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx" podUID="11585794-4db5-4e34-aeaa-24036489269b" Oct 03 18:29:25 crc kubenswrapper[4835]: I1003 18:29:25.944464 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd" podStartSLOduration=2.944446478 podStartE2EDuration="2.944446478s" podCreationTimestamp="2025-10-03 18:29:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:29:25.924278298 +0000 UTC m=+907.640219180" watchObservedRunningTime="2025-10-03 18:29:25.944446478 +0000 UTC m=+907.660387350" Oct 03 18:29:26 crc kubenswrapper[4835]: E1003 18:29:26.936383 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488" podUID="461cd79d-16ba-4619-b2f0-1e4611a092e1" Oct 03 18:29:26 crc kubenswrapper[4835]: E1003 18:29:26.936499 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475\\\"\"" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" podUID="050d6159-92eb-4f65-8c33-dce9d2cac262" Oct 03 18:29:26 crc kubenswrapper[4835]: E1003 18:29:26.936570 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq" podUID="ad1db9d5-da6c-4498-8bb4-176c87481a4d" Oct 03 18:29:26 crc kubenswrapper[4835]: E1003 18:29:26.936629 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:725da67b3f9cf2758564e0111928cdd570c0f6f1ca34775f159bbe94deb82548\\\"\"" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx" podUID="11585794-4db5-4e34-aeaa-24036489269b" Oct 03 18:29:26 crc kubenswrapper[4835]: E1003 18:29:26.936824 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9" podUID="cebb3329-2036-4540-9444-d47294d13aff" Oct 03 18:29:26 crc kubenswrapper[4835]: E1003 18:29:26.936929 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl" podUID="afea5a2a-fdd8-47a5-b09e-44f47b2f3f92" Oct 03 18:29:34 crc kubenswrapper[4835]: I1003 18:29:34.434775 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5b9c56875d-f9xpd" Oct 03 18:29:35 crc kubenswrapper[4835]: I1003 18:29:35.358833 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:29:35 crc kubenswrapper[4835]: I1003 18:29:35.358886 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:29:37 crc kubenswrapper[4835]: I1003 18:29:37.031975 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-g5nh8" event={"ID":"822f37e4-b1f2-40ed-a075-cdff171e42bd","Type":"ContainerStarted","Data":"3b27c94527a9dc735753b8e0fc1df0e68b30ae513aec0aa9749c53100b0fada7"} Oct 03 18:29:37 crc kubenswrapper[4835]: I1003 18:29:37.041695 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-6t4pn" event={"ID":"3c7d1a89-7769-4031-abac-edd6b93bdd30","Type":"ContainerStarted","Data":"e6ee3b43e13e0cf4da3caea69cc2984f54e59419ecaf1b09c163aaa49b5a9f44"} Oct 03 18:29:37 crc kubenswrapper[4835]: I1003 18:29:37.045359 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7d5d9b469c-dh8bv" event={"ID":"01389591-f485-4a18-a393-8f5d654ba5e7","Type":"ContainerStarted","Data":"08bac73a1cf753693b36550bb08e1ced03455698dea988ae8bf229afaadb5979"} Oct 03 18:29:37 crc kubenswrapper[4835]: I1003 18:29:37.074227 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-rrqt8" event={"ID":"f2b0de06-963e-4fd4-8056-3272f20f3ef8","Type":"ContainerStarted","Data":"fb6e161973fb2bcb03fad9d61de9ae5c44f8ec32c8b5ca72a497e8be37f34c9c"} Oct 03 18:29:37 crc kubenswrapper[4835]: I1003 18:29:37.102598 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-lpmkz" event={"ID":"9a423f22-efb7-4413-b0ba-886fb392aa5c","Type":"ContainerStarted","Data":"cf7a753ed81cdd672c0801a0817dd6a548649a990003832b279d32ecb3c8aef6"} Oct 03 18:29:37 crc kubenswrapper[4835]: I1003 18:29:37.126238 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-9x4rx" event={"ID":"967b019a-36f2-4dda-8d6f-968cfb65f954","Type":"ContainerStarted","Data":"c8a2b88080d0e817f4b40a6374da4b44f110b0394a642b0f61dc3607c19e834b"} Oct 03 18:29:37 crc kubenswrapper[4835]: I1003 18:29:37.141115 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-dmgs4" event={"ID":"54c68b19-b66f-47af-808e-e3708ee36642","Type":"ContainerStarted","Data":"686bf5c8df5a4121cd6813c849e869979c37b41ac12c11b09e2c5535aea8f1bf"} Oct 03 18:29:37 crc kubenswrapper[4835]: I1003 18:29:37.161311 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-dnbkj" event={"ID":"75049f02-3a34-4376-b2b2-1c447894b16c","Type":"ContainerStarted","Data":"c7c37910b1d9c3398a6eb10322563b0e9d28a008d9f03301a9f93fee631702a4"} Oct 03 18:29:37 crc kubenswrapper[4835]: I1003 18:29:37.177406 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-6dbbn" event={"ID":"3d57a2da-d32b-4fea-8044-da2ab34d87d5","Type":"ContainerStarted","Data":"499ef52d77cb157d7e145b89ce349ae22602a8b59538a2b10338ffba35ccbdd5"} Oct 03 18:29:37 crc kubenswrapper[4835]: I1003 18:29:37.187357 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-xrb5b" 
event={"ID":"85ad7a18-4a2c-4225-972b-5e12be17aee0","Type":"ContainerStarted","Data":"6d8a5880a7e7928c83d79276da9cdfaea2b12c4c4684ce97ff518d9ad889b549"} Oct 03 18:29:37 crc kubenswrapper[4835]: I1003 18:29:37.188394 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2w59k" event={"ID":"f7af64e9-1970-4672-8564-ba96ab371353","Type":"ContainerStarted","Data":"7c0425a33b98ed8a62339eed5c664238e495e2fe0ec29c2ff953a38d9792bfb0"} Oct 03 18:29:37 crc kubenswrapper[4835]: I1003 18:29:37.189293 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-5j994" event={"ID":"6b594ae9-45e5-4ae6-b59e-8e1e44a182db","Type":"ContainerStarted","Data":"bb1ff22f3a201a72ecb992d3b2b71de620ecacdf9ff0424c62db723e2dd5bbf7"} Oct 03 18:29:37 crc kubenswrapper[4835]: I1003 18:29:37.203346 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-bxwnx" event={"ID":"c248086e-e0fe-47f0-972e-8378189c018f","Type":"ContainerStarted","Data":"83b6ca1be364ad0e2d64770aa46cd2b485e79e3f49a73b59118bb2b00e196115"} Oct 03 18:29:37 crc kubenswrapper[4835]: I1003 18:29:37.221604 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-8b5bp" event={"ID":"0d92cdf1-cce4-485e-9ed5-2539600d7e36","Type":"ContainerStarted","Data":"3fe82227282669c23a4b8226d734fee6cab9968c8ed0b962beeb9a40c8d5cc2e"} Oct 03 18:29:37 crc kubenswrapper[4835]: I1003 18:29:37.239021 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" event={"ID":"e96c6565-8432-4b47-bc1a-f7510415a0dd","Type":"ContainerStarted","Data":"880d335dc1c8dcef2d305bdf63396423aa3d4435491a057758d4fb991fbe729b"} Oct 03 18:29:37 crc kubenswrapper[4835]: I1003 18:29:37.253582 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-h6txd" event={"ID":"52d602a7-7c52-410b-b0d3-7a2233258474","Type":"ContainerStarted","Data":"4afa1bb3cd0e3ba4a6ff853e9fb5f8e4844daafaf760cbbe510746d835ccf89b"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.260655 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2w59k" event={"ID":"f7af64e9-1970-4672-8564-ba96ab371353","Type":"ContainerStarted","Data":"ed52dc67a9c7f4e5dc0ffa4b8ad60158819f4087870e677c9cda4c36f2741dc4"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.261431 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2w59k" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.262350 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-g5nh8" event={"ID":"822f37e4-b1f2-40ed-a075-cdff171e42bd","Type":"ContainerStarted","Data":"7c767bc4c294b415debbebfcd7917f8bc7e0fc05b72def432348725de15bee31"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.262594 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-g5nh8" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.263915 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-dmgs4" event={"ID":"54c68b19-b66f-47af-808e-e3708ee36642","Type":"ContainerStarted","Data":"3063a56791eed729ef3d0abd5b28f5eb0eb4f0e521d26d026dfdde0758962dd6"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.264027 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-dmgs4" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.265601 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" event={"ID":"e96c6565-8432-4b47-bc1a-f7510415a0dd","Type":"ContainerStarted","Data":"d1705f9e35a0e7144f371480493d5214cd1826a88eeacf9f2e74109bdc01fd59"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.266188 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.267529 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-dnbkj" event={"ID":"75049f02-3a34-4376-b2b2-1c447894b16c","Type":"ContainerStarted","Data":"b830bcc40e7644f716dc94539d5a5d4c206089bf4c19fc535c18acb0ffe05965"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.267652 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-dnbkj" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.269047 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-8b5bp" event={"ID":"0d92cdf1-cce4-485e-9ed5-2539600d7e36","Type":"ContainerStarted","Data":"ef7a673599dbe1ceaba99526d53cafbf1760dc30ff8621c8ed52b62b63f8b74e"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.269295 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-8b5bp" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.270696 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-lpmkz" event={"ID":"9a423f22-efb7-4413-b0ba-886fb392aa5c","Type":"ContainerStarted","Data":"fa4536cd66510841c26829a609815f504cfcd31b3cf040fe8fabdcfdc7ed4c71"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.272380 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-h6txd" event={"ID":"52d602a7-7c52-410b-b0d3-7a2233258474","Type":"ContainerStarted","Data":"df12a20695a5efedea9b350bea5a15b0f8785296960123866c545344710d6f87"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.272825 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-h6txd" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.274101 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-6dbbn" event={"ID":"3d57a2da-d32b-4fea-8044-da2ab34d87d5","Type":"ContainerStarted","Data":"47f1133b9e4c1313d9151563fa4b061d771954c5d83ff69417c8eaa5d6794254"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.274318 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-6dbbn" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.275614 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-6t4pn" event={"ID":"3c7d1a89-7769-4031-abac-edd6b93bdd30","Type":"ContainerStarted","Data":"26bcd762f501cba1accd71c1193a18806ba4a616557b6b2cfb531181f8d465d7"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.275856 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-6t4pn" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.277306 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-9x4rx" event={"ID":"967b019a-36f2-4dda-8d6f-968cfb65f954","Type":"ContainerStarted","Data":"57308e7cc142618d3e5ced2d68614295bb055821c8239433358a38e21dd6f164"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.278582 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-xrb5b" event={"ID":"85ad7a18-4a2c-4225-972b-5e12be17aee0","Type":"ContainerStarted","Data":"ad99da3d11a04f7bb38cc27bcf182ca96b4116e5d672ca9f5d3a6c5b48b0b913"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.278705 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-xrb5b" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.279936 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-5j994" event={"ID":"6b594ae9-45e5-4ae6-b59e-8e1e44a182db","Type":"ContainerStarted","Data":"827b7fb758b1c11077a849a4b9b7c5ad0ff313cc647dc9a66de9791c500e61f8"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.280049 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-5j994" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.281291 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-rrqt8" event={"ID":"f2b0de06-963e-4fd4-8056-3272f20f3ef8","Type":"ContainerStarted","Data":"19fd2345d03c630aa124b4026645d63e065f78573a029e8bd3609b54f1337232"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.281719 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-rrqt8" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.283041 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-bxwnx" event={"ID":"c248086e-e0fe-47f0-972e-8378189c018f","Type":"ContainerStarted","Data":"efd8270d090d90b122a93a094aca7cb56a6ebffeb483d9ac0fb4d5f2f342dc06"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.283325 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-bxwnx" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.284588 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7d5d9b469c-dh8bv" 
event={"ID":"01389591-f485-4a18-a393-8f5d654ba5e7","Type":"ContainerStarted","Data":"dd1f5f4e888739952752b61cb27f880ee76cfa4157045b8ea47a002c69d6ee7f"} Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.284784 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7d5d9b469c-dh8bv" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.291209 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2w59k" podStartSLOduration=4.451696804 podStartE2EDuration="16.291193594s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.146775716 +0000 UTC m=+905.862716588" lastFinishedPulling="2025-10-03 18:29:35.986272506 +0000 UTC m=+917.702213378" observedRunningTime="2025-10-03 18:29:38.287556033 +0000 UTC m=+920.003496905" watchObservedRunningTime="2025-10-03 18:29:38.291193594 +0000 UTC m=+920.007134466" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.302591 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-6t4pn" podStartSLOduration=4.800495634 podStartE2EDuration="16.302576516s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.483053536 +0000 UTC m=+906.198994408" lastFinishedPulling="2025-10-03 18:29:35.985134428 +0000 UTC m=+917.701075290" observedRunningTime="2025-10-03 18:29:38.301788096 +0000 UTC m=+920.017728968" watchObservedRunningTime="2025-10-03 18:29:38.302576516 +0000 UTC m=+920.018517388" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.318984 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-xrb5b" podStartSLOduration=4.817173757 podStartE2EDuration="16.318967522s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.483384134 +0000 UTC m=+906.199325006" lastFinishedPulling="2025-10-03 18:29:35.985177899 +0000 UTC m=+917.701118771" observedRunningTime="2025-10-03 18:29:38.317490055 +0000 UTC m=+920.033430927" watchObservedRunningTime="2025-10-03 18:29:38.318967522 +0000 UTC m=+920.034908394" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.372033 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-599898f689-lpmkz" podStartSLOduration=4.475678708 podStartE2EDuration="16.372017066s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.147842062 +0000 UTC m=+905.863782934" lastFinishedPulling="2025-10-03 18:29:36.04418041 +0000 UTC m=+917.760121292" observedRunningTime="2025-10-03 18:29:38.345772886 +0000 UTC m=+920.061713758" watchObservedRunningTime="2025-10-03 18:29:38.372017066 +0000 UTC m=+920.087957938" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.372628 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-dnbkj" podStartSLOduration=4.869508464 podStartE2EDuration="16.372622171s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.483146609 +0000 UTC m=+906.199087481" lastFinishedPulling="2025-10-03 18:29:35.986260316 +0000 UTC m=+917.702201188" observedRunningTime="2025-10-03 18:29:38.370345224 +0000 UTC m=+920.086286106" 
watchObservedRunningTime="2025-10-03 18:29:38.372622171 +0000 UTC m=+920.088563043" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.399261 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" podStartSLOduration=5.913514955 podStartE2EDuration="16.3992432s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:25.559295207 +0000 UTC m=+907.275236079" lastFinishedPulling="2025-10-03 18:29:36.045023452 +0000 UTC m=+917.760964324" observedRunningTime="2025-10-03 18:29:38.393315024 +0000 UTC m=+920.109255896" watchObservedRunningTime="2025-10-03 18:29:38.3992432 +0000 UTC m=+920.115184072" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.413887 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-5j994" podStartSLOduration=4.63805307 podStartE2EDuration="16.413866102s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.235144745 +0000 UTC m=+905.951085607" lastFinishedPulling="2025-10-03 18:29:36.010957767 +0000 UTC m=+917.726898639" observedRunningTime="2025-10-03 18:29:38.410747195 +0000 UTC m=+920.126688057" watchObservedRunningTime="2025-10-03 18:29:38.413866102 +0000 UTC m=+920.129806974" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.443783 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-8b5bp" podStartSLOduration=4.457823575 podStartE2EDuration="16.443762083s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:23.999253521 +0000 UTC m=+905.715194393" lastFinishedPulling="2025-10-03 18:29:35.985192029 +0000 UTC m=+917.701132901" observedRunningTime="2025-10-03 18:29:38.440213355 +0000 UTC m=+920.156154227" watchObservedRunningTime="2025-10-03 18:29:38.443762083 +0000 UTC m=+920.159702955" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.466971 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-dmgs4" podStartSLOduration=4.907678249 podStartE2EDuration="16.466955577s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.452311595 +0000 UTC m=+906.168252467" lastFinishedPulling="2025-10-03 18:29:36.011588933 +0000 UTC m=+917.727529795" observedRunningTime="2025-10-03 18:29:38.458772364 +0000 UTC m=+920.174713236" watchObservedRunningTime="2025-10-03 18:29:38.466955577 +0000 UTC m=+920.182896439" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.479553 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-h6txd" podStartSLOduration=4.607588516 podStartE2EDuration="16.479526829s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.13481809 +0000 UTC m=+905.850758962" lastFinishedPulling="2025-10-03 18:29:36.006756403 +0000 UTC m=+917.722697275" observedRunningTime="2025-10-03 18:29:38.479526539 +0000 UTC m=+920.195467411" watchObservedRunningTime="2025-10-03 18:29:38.479526829 +0000 UTC m=+920.195467701" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.516528 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-9x4rx" podStartSLOduration=4.905230319 podStartE2EDuration="16.516509205s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.431690594 +0000 UTC m=+906.147631466" lastFinishedPulling="2025-10-03 18:29:36.04296948 +0000 UTC m=+917.758910352" observedRunningTime="2025-10-03 18:29:38.510372513 +0000 UTC m=+920.226313375" watchObservedRunningTime="2025-10-03 18:29:38.516509205 +0000 UTC m=+920.232450077" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.527968 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-bxwnx" podStartSLOduration=4.945702142 podStartE2EDuration="16.527950519s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.457555495 +0000 UTC m=+906.173496367" lastFinishedPulling="2025-10-03 18:29:36.039803882 +0000 UTC m=+917.755744744" observedRunningTime="2025-10-03 18:29:38.525040327 +0000 UTC m=+920.240981199" watchObservedRunningTime="2025-10-03 18:29:38.527950519 +0000 UTC m=+920.243891391" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.547798 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-g5nh8" podStartSLOduration=5.366778803 podStartE2EDuration="16.54778338s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.860935667 +0000 UTC m=+906.576876539" lastFinishedPulling="2025-10-03 18:29:36.041940244 +0000 UTC m=+917.757881116" observedRunningTime="2025-10-03 18:29:38.545000441 +0000 UTC m=+920.260941313" watchObservedRunningTime="2025-10-03 18:29:38.54778338 +0000 UTC m=+920.263724242" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.568402 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-rrqt8" podStartSLOduration=5.008357255 podStartE2EDuration="16.568390541s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.452005158 +0000 UTC m=+906.167946030" lastFinishedPulling="2025-10-03 18:29:36.012038454 +0000 UTC m=+917.727979316" observedRunningTime="2025-10-03 18:29:38.566967905 +0000 UTC m=+920.282908777" watchObservedRunningTime="2025-10-03 18:29:38.568390541 +0000 UTC m=+920.284331413" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.588916 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-6dbbn" podStartSLOduration=5.174930059 podStartE2EDuration="16.588896108s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.630232632 +0000 UTC m=+906.346173504" lastFinishedPulling="2025-10-03 18:29:36.044198691 +0000 UTC m=+917.760139553" observedRunningTime="2025-10-03 18:29:38.586118829 +0000 UTC m=+920.302059701" watchObservedRunningTime="2025-10-03 18:29:38.588896108 +0000 UTC m=+920.304836980" Oct 03 18:29:38 crc kubenswrapper[4835]: I1003 18:29:38.609529 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7d5d9b469c-dh8bv" podStartSLOduration=4.566948631 podStartE2EDuration="15.609511869s" podCreationTimestamp="2025-10-03 18:29:23 +0000 UTC" firstStartedPulling="2025-10-03 18:29:25.026845737 +0000 UTC 
m=+906.742786609" lastFinishedPulling="2025-10-03 18:29:36.069408975 +0000 UTC m=+917.785349847" observedRunningTime="2025-10-03 18:29:38.60433981 +0000 UTC m=+920.320280962" watchObservedRunningTime="2025-10-03 18:29:38.609511869 +0000 UTC m=+920.325452741" Oct 03 18:29:39 crc kubenswrapper[4835]: I1003 18:29:39.292184 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-599898f689-lpmkz" Oct 03 18:29:39 crc kubenswrapper[4835]: I1003 18:29:39.292501 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-9x4rx" Oct 03 18:29:41 crc kubenswrapper[4835]: I1003 18:29:41.305573 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" event={"ID":"050d6159-92eb-4f65-8c33-dce9d2cac262","Type":"ContainerStarted","Data":"36970e7ff1a88734fbee4b3dab1c2ddc3512a7bbfb314b9eb79bd17235f0a387"} Oct 03 18:29:41 crc kubenswrapper[4835]: I1003 18:29:41.306930 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" Oct 03 18:29:41 crc kubenswrapper[4835]: I1003 18:29:41.308592 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9" event={"ID":"cebb3329-2036-4540-9444-d47294d13aff","Type":"ContainerStarted","Data":"2ddd213e265f9d9df3942aa44fa9f900a5cdf2aef86965ccbaf028c051b5a973"} Oct 03 18:29:41 crc kubenswrapper[4835]: I1003 18:29:41.308941 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9" Oct 03 18:29:41 crc kubenswrapper[4835]: I1003 18:29:41.339789 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" podStartSLOduration=3.851319721 podStartE2EDuration="19.339772973s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.874924544 +0000 UTC m=+906.590865416" lastFinishedPulling="2025-10-03 18:29:40.363377796 +0000 UTC m=+922.079318668" observedRunningTime="2025-10-03 18:29:41.33034755 +0000 UTC m=+923.046288412" watchObservedRunningTime="2025-10-03 18:29:41.339772973 +0000 UTC m=+923.055713845" Oct 03 18:29:41 crc kubenswrapper[4835]: I1003 18:29:41.351321 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9" podStartSLOduration=3.8676654470000003 podStartE2EDuration="19.351305209s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.873247373 +0000 UTC m=+906.589188245" lastFinishedPulling="2025-10-03 18:29:40.356887135 +0000 UTC m=+922.072828007" observedRunningTime="2025-10-03 18:29:41.346667674 +0000 UTC m=+923.062608546" watchObservedRunningTime="2025-10-03 18:29:41.351305209 +0000 UTC m=+923.067246081" Oct 03 18:29:42 crc kubenswrapper[4835]: I1003 18:29:42.944646 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-8b5bp" Oct 03 18:29:42 crc kubenswrapper[4835]: I1003 18:29:42.965941 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-2w59k" Oct 03 
18:29:42 crc kubenswrapper[4835]: I1003 18:29:42.967600 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-h6txd" Oct 03 18:29:42 crc kubenswrapper[4835]: I1003 18:29:42.985782 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-5j994" Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.002994 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-599898f689-lpmkz" Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.127246 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-9x4rx" Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.274375 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-6dbbn" Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.328626 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-dnbkj" Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.331114 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-bxwnx" Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.331398 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-g5nh8" Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.337379 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488" event={"ID":"461cd79d-16ba-4619-b2f0-1e4611a092e1","Type":"ContainerStarted","Data":"5a4051c5a01806bb5ab049e9489405d79cabac02b105e89bd1a8f7f26fe6d898"} Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.337595 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488" Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.343058 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq" event={"ID":"ad1db9d5-da6c-4498-8bb4-176c87481a4d","Type":"ContainerStarted","Data":"953a146ea5523db51ceab54e2143e9b6cbcf92a8164600c159cb16814183839d"} Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.343668 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq" Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.372199 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488" podStartSLOduration=2.949542025 podStartE2EDuration="20.372180141s" podCreationTimestamp="2025-10-03 18:29:23 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.865329606 +0000 UTC m=+906.581270478" lastFinishedPulling="2025-10-03 18:29:42.287967722 +0000 UTC m=+924.003908594" observedRunningTime="2025-10-03 18:29:43.371828692 +0000 UTC m=+925.087769564" watchObservedRunningTime="2025-10-03 18:29:43.372180141 +0000 UTC m=+925.088121013" Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.380618 4835 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-rrqt8" Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.393187 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq" podStartSLOduration=2.996041287 podStartE2EDuration="20.393166701s" podCreationTimestamp="2025-10-03 18:29:23 +0000 UTC" firstStartedPulling="2025-10-03 18:29:24.868494384 +0000 UTC m=+906.584435256" lastFinishedPulling="2025-10-03 18:29:42.265619798 +0000 UTC m=+923.981560670" observedRunningTime="2025-10-03 18:29:43.391535841 +0000 UTC m=+925.107476713" watchObservedRunningTime="2025-10-03 18:29:43.393166701 +0000 UTC m=+925.109107573" Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.405873 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-xrb5b" Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.463302 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-dmgs4" Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.513758 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-6t4pn" Oct 03 18:29:43 crc kubenswrapper[4835]: I1003 18:29:43.747411 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7d5d9b469c-dh8bv" Oct 03 18:29:44 crc kubenswrapper[4835]: I1003 18:29:44.996903 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t" Oct 03 18:29:45 crc kubenswrapper[4835]: I1003 18:29:45.359975 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx" event={"ID":"11585794-4db5-4e34-aeaa-24036489269b","Type":"ContainerStarted","Data":"d24df3f7d83e1bb471f912e0429d187330a9edb179c221c3b9d2c403c9da6b7d"} Oct 03 18:29:45 crc kubenswrapper[4835]: I1003 18:29:45.361202 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx" Oct 03 18:29:45 crc kubenswrapper[4835]: I1003 18:29:45.363173 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl" event={"ID":"afea5a2a-fdd8-47a5-b09e-44f47b2f3f92","Type":"ContainerStarted","Data":"5169642f4e0181cb220975b99a263bf9edd3d33f99d7b41be668c3d8771ebea9"} Oct 03 18:29:45 crc kubenswrapper[4835]: I1003 18:29:45.383961 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx" podStartSLOduration=3.938712477 podStartE2EDuration="23.383946177s" podCreationTimestamp="2025-10-03 18:29:22 +0000 UTC" firstStartedPulling="2025-10-03 18:29:25.02774959 +0000 UTC m=+906.743690462" lastFinishedPulling="2025-10-03 18:29:44.47298329 +0000 UTC m=+926.188924162" observedRunningTime="2025-10-03 18:29:45.380253355 +0000 UTC m=+927.096194227" watchObservedRunningTime="2025-10-03 18:29:45.383946177 +0000 UTC m=+927.099887049" Oct 03 18:29:45 crc kubenswrapper[4835]: I1003 18:29:45.407254 4835 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl" podStartSLOduration=2.967079258 podStartE2EDuration="22.407215053s" podCreationTimestamp="2025-10-03 18:29:23 +0000 UTC" firstStartedPulling="2025-10-03 18:29:25.026892708 +0000 UTC m=+906.742833580" lastFinishedPulling="2025-10-03 18:29:44.467028493 +0000 UTC m=+926.182969375" observedRunningTime="2025-10-03 18:29:45.39863215 +0000 UTC m=+927.114573022" watchObservedRunningTime="2025-10-03 18:29:45.407215053 +0000 UTC m=+927.123155965" Oct 03 18:29:53 crc kubenswrapper[4835]: I1003 18:29:53.525751 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-xdgkx" Oct 03 18:29:53 crc kubenswrapper[4835]: I1003 18:29:53.566474 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-fljf9" Oct 03 18:29:53 crc kubenswrapper[4835]: I1003 18:29:53.577995 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-bzmxq" Oct 03 18:29:53 crc kubenswrapper[4835]: I1003 18:29:53.599733 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-6g488" Oct 03 18:29:53 crc kubenswrapper[4835]: I1003 18:29:53.688752 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-5hv6c" Oct 03 18:30:00 crc kubenswrapper[4835]: I1003 18:30:00.152283 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr"] Oct 03 18:30:00 crc kubenswrapper[4835]: I1003 18:30:00.153996 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr" Oct 03 18:30:00 crc kubenswrapper[4835]: I1003 18:30:00.156268 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 18:30:00 crc kubenswrapper[4835]: I1003 18:30:00.156470 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 18:30:00 crc kubenswrapper[4835]: I1003 18:30:00.179556 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr"] Oct 03 18:30:00 crc kubenswrapper[4835]: I1003 18:30:00.257152 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-config-volume\") pod \"collect-profiles-29325270-9jcbr\" (UID: \"c91cd3a2-2f89-4d78-b78f-09b6c2851f18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr" Oct 03 18:30:00 crc kubenswrapper[4835]: I1003 18:30:00.257217 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-secret-volume\") pod \"collect-profiles-29325270-9jcbr\" (UID: \"c91cd3a2-2f89-4d78-b78f-09b6c2851f18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr" Oct 03 18:30:00 crc kubenswrapper[4835]: I1003 18:30:00.257254 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2fld\" (UniqueName: \"kubernetes.io/projected/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-kube-api-access-s2fld\") pod \"collect-profiles-29325270-9jcbr\" (UID: \"c91cd3a2-2f89-4d78-b78f-09b6c2851f18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr" Oct 03 18:30:00 crc kubenswrapper[4835]: I1003 18:30:00.358853 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2fld\" (UniqueName: \"kubernetes.io/projected/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-kube-api-access-s2fld\") pod \"collect-profiles-29325270-9jcbr\" (UID: \"c91cd3a2-2f89-4d78-b78f-09b6c2851f18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr" Oct 03 18:30:00 crc kubenswrapper[4835]: I1003 18:30:00.358952 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-config-volume\") pod \"collect-profiles-29325270-9jcbr\" (UID: \"c91cd3a2-2f89-4d78-b78f-09b6c2851f18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr" Oct 03 18:30:00 crc kubenswrapper[4835]: I1003 18:30:00.358992 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-secret-volume\") pod \"collect-profiles-29325270-9jcbr\" (UID: \"c91cd3a2-2f89-4d78-b78f-09b6c2851f18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr" Oct 03 18:30:00 crc kubenswrapper[4835]: I1003 18:30:00.359947 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-config-volume\") pod 
\"collect-profiles-29325270-9jcbr\" (UID: \"c91cd3a2-2f89-4d78-b78f-09b6c2851f18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr" Oct 03 18:30:00 crc kubenswrapper[4835]: I1003 18:30:00.364063 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-secret-volume\") pod \"collect-profiles-29325270-9jcbr\" (UID: \"c91cd3a2-2f89-4d78-b78f-09b6c2851f18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr" Oct 03 18:30:00 crc kubenswrapper[4835]: I1003 18:30:00.375685 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2fld\" (UniqueName: \"kubernetes.io/projected/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-kube-api-access-s2fld\") pod \"collect-profiles-29325270-9jcbr\" (UID: \"c91cd3a2-2f89-4d78-b78f-09b6c2851f18\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr" Oct 03 18:30:00 crc kubenswrapper[4835]: I1003 18:30:00.489398 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr" Oct 03 18:30:00 crc kubenswrapper[4835]: I1003 18:30:00.896732 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr"] Oct 03 18:30:01 crc kubenswrapper[4835]: I1003 18:30:01.467899 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr" event={"ID":"c91cd3a2-2f89-4d78-b78f-09b6c2851f18","Type":"ContainerStarted","Data":"656097b70d3d165bc1b8375e0c704391da93c6bdb3f0be5900aae6c1632831de"} Oct 03 18:30:05 crc kubenswrapper[4835]: I1003 18:30:05.358713 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:30:05 crc kubenswrapper[4835]: I1003 18:30:05.359197 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:30:05 crc kubenswrapper[4835]: I1003 18:30:05.359267 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:30:05 crc kubenswrapper[4835]: I1003 18:30:05.359960 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fc96018384aa8860a4c2fcec8a03cef5fa41451e8751027f47a38b13cdf1722b"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 18:30:05 crc kubenswrapper[4835]: I1003 18:30:05.360028 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://fc96018384aa8860a4c2fcec8a03cef5fa41451e8751027f47a38b13cdf1722b" gracePeriod=600 Oct 03 18:30:05 crc 
kubenswrapper[4835]: I1003 18:30:05.499775 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="fc96018384aa8860a4c2fcec8a03cef5fa41451e8751027f47a38b13cdf1722b" exitCode=0 Oct 03 18:30:05 crc kubenswrapper[4835]: I1003 18:30:05.499819 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"fc96018384aa8860a4c2fcec8a03cef5fa41451e8751027f47a38b13cdf1722b"} Oct 03 18:30:05 crc kubenswrapper[4835]: I1003 18:30:05.499851 4835 scope.go:117] "RemoveContainer" containerID="6cbefddbf8736040316432cea35633f5b9fb0a39e77bcc8a41c22e4802ea88fe" Oct 03 18:30:06 crc kubenswrapper[4835]: I1003 18:30:06.508140 4835 generic.go:334] "Generic (PLEG): container finished" podID="c91cd3a2-2f89-4d78-b78f-09b6c2851f18" containerID="7e01af0729f0f491d3deed60f3a11e956efe3ac1486aef74a6796d66736f18e0" exitCode=0 Oct 03 18:30:06 crc kubenswrapper[4835]: I1003 18:30:06.508240 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr" event={"ID":"c91cd3a2-2f89-4d78-b78f-09b6c2851f18","Type":"ContainerDied","Data":"7e01af0729f0f491d3deed60f3a11e956efe3ac1486aef74a6796d66736f18e0"} Oct 03 18:30:07 crc kubenswrapper[4835]: I1003 18:30:07.885664 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr" Oct 03 18:30:07 crc kubenswrapper[4835]: I1003 18:30:07.966137 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2fld\" (UniqueName: \"kubernetes.io/projected/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-kube-api-access-s2fld\") pod \"c91cd3a2-2f89-4d78-b78f-09b6c2851f18\" (UID: \"c91cd3a2-2f89-4d78-b78f-09b6c2851f18\") " Oct 03 18:30:07 crc kubenswrapper[4835]: I1003 18:30:07.966236 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-config-volume\") pod \"c91cd3a2-2f89-4d78-b78f-09b6c2851f18\" (UID: \"c91cd3a2-2f89-4d78-b78f-09b6c2851f18\") " Oct 03 18:30:07 crc kubenswrapper[4835]: I1003 18:30:07.966276 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-secret-volume\") pod \"c91cd3a2-2f89-4d78-b78f-09b6c2851f18\" (UID: \"c91cd3a2-2f89-4d78-b78f-09b6c2851f18\") " Oct 03 18:30:07 crc kubenswrapper[4835]: I1003 18:30:07.966701 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-config-volume" (OuterVolumeSpecName: "config-volume") pod "c91cd3a2-2f89-4d78-b78f-09b6c2851f18" (UID: "c91cd3a2-2f89-4d78-b78f-09b6c2851f18"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:30:07 crc kubenswrapper[4835]: I1003 18:30:07.977900 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-kube-api-access-s2fld" (OuterVolumeSpecName: "kube-api-access-s2fld") pod "c91cd3a2-2f89-4d78-b78f-09b6c2851f18" (UID: "c91cd3a2-2f89-4d78-b78f-09b6c2851f18"). InnerVolumeSpecName "kube-api-access-s2fld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:30:07 crc kubenswrapper[4835]: I1003 18:30:07.979270 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c91cd3a2-2f89-4d78-b78f-09b6c2851f18" (UID: "c91cd3a2-2f89-4d78-b78f-09b6c2851f18"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:30:08 crc kubenswrapper[4835]: I1003 18:30:08.067895 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2fld\" (UniqueName: \"kubernetes.io/projected/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-kube-api-access-s2fld\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:08 crc kubenswrapper[4835]: I1003 18:30:08.068222 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:08 crc kubenswrapper[4835]: I1003 18:30:08.068391 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c91cd3a2-2f89-4d78-b78f-09b6c2851f18-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:08 crc kubenswrapper[4835]: I1003 18:30:08.522502 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr" event={"ID":"c91cd3a2-2f89-4d78-b78f-09b6c2851f18","Type":"ContainerDied","Data":"656097b70d3d165bc1b8375e0c704391da93c6bdb3f0be5900aae6c1632831de"} Oct 03 18:30:08 crc kubenswrapper[4835]: I1003 18:30:08.522557 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr" Oct 03 18:30:08 crc kubenswrapper[4835]: I1003 18:30:08.522569 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="656097b70d3d165bc1b8375e0c704391da93c6bdb3f0be5900aae6c1632831de" Oct 03 18:30:10 crc kubenswrapper[4835]: I1003 18:30:10.536826 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"5001daf2345cdee7613b40d98138459acb007dbdcb955c73ad790b203e897d4f"} Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.128690 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fdd4c647c-cqjlp"] Oct 03 18:30:11 crc kubenswrapper[4835]: E1003 18:30:11.129311 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91cd3a2-2f89-4d78-b78f-09b6c2851f18" containerName="collect-profiles" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.129323 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91cd3a2-2f89-4d78-b78f-09b6c2851f18" containerName="collect-profiles" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.129460 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c91cd3a2-2f89-4d78-b78f-09b6c2851f18" containerName="collect-profiles" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.130305 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fdd4c647c-cqjlp" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.131742 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-nbb26" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.133704 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.133845 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.134380 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.134615 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fdd4c647c-cqjlp"] Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.169936 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66c59f6f4c-shtk2"] Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.171042 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66c59f6f4c-shtk2" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.175357 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.192729 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66c59f6f4c-shtk2"] Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.208745 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94lrg\" (UniqueName: \"kubernetes.io/projected/5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb-kube-api-access-94lrg\") pod \"dnsmasq-dns-5fdd4c647c-cqjlp\" (UID: \"5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb\") " pod="openstack/dnsmasq-dns-5fdd4c647c-cqjlp" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.208793 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb-config\") pod \"dnsmasq-dns-5fdd4c647c-cqjlp\" (UID: \"5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb\") " pod="openstack/dnsmasq-dns-5fdd4c647c-cqjlp" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.208848 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b79636d7-e033-4e08-bb08-4cf8c0406522-dns-svc\") pod \"dnsmasq-dns-66c59f6f4c-shtk2\" (UID: \"b79636d7-e033-4e08-bb08-4cf8c0406522\") " pod="openstack/dnsmasq-dns-66c59f6f4c-shtk2" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.208868 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b79636d7-e033-4e08-bb08-4cf8c0406522-config\") pod \"dnsmasq-dns-66c59f6f4c-shtk2\" (UID: \"b79636d7-e033-4e08-bb08-4cf8c0406522\") " pod="openstack/dnsmasq-dns-66c59f6f4c-shtk2" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.209028 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qj7v\" (UniqueName: \"kubernetes.io/projected/b79636d7-e033-4e08-bb08-4cf8c0406522-kube-api-access-9qj7v\") pod \"dnsmasq-dns-66c59f6f4c-shtk2\" (UID: \"b79636d7-e033-4e08-bb08-4cf8c0406522\") " 
pod="openstack/dnsmasq-dns-66c59f6f4c-shtk2" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.310648 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94lrg\" (UniqueName: \"kubernetes.io/projected/5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb-kube-api-access-94lrg\") pod \"dnsmasq-dns-5fdd4c647c-cqjlp\" (UID: \"5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb\") " pod="openstack/dnsmasq-dns-5fdd4c647c-cqjlp" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.310701 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb-config\") pod \"dnsmasq-dns-5fdd4c647c-cqjlp\" (UID: \"5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb\") " pod="openstack/dnsmasq-dns-5fdd4c647c-cqjlp" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.310732 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b79636d7-e033-4e08-bb08-4cf8c0406522-dns-svc\") pod \"dnsmasq-dns-66c59f6f4c-shtk2\" (UID: \"b79636d7-e033-4e08-bb08-4cf8c0406522\") " pod="openstack/dnsmasq-dns-66c59f6f4c-shtk2" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.310751 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b79636d7-e033-4e08-bb08-4cf8c0406522-config\") pod \"dnsmasq-dns-66c59f6f4c-shtk2\" (UID: \"b79636d7-e033-4e08-bb08-4cf8c0406522\") " pod="openstack/dnsmasq-dns-66c59f6f4c-shtk2" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.310820 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qj7v\" (UniqueName: \"kubernetes.io/projected/b79636d7-e033-4e08-bb08-4cf8c0406522-kube-api-access-9qj7v\") pod \"dnsmasq-dns-66c59f6f4c-shtk2\" (UID: \"b79636d7-e033-4e08-bb08-4cf8c0406522\") " pod="openstack/dnsmasq-dns-66c59f6f4c-shtk2" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.311690 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b79636d7-e033-4e08-bb08-4cf8c0406522-config\") pod \"dnsmasq-dns-66c59f6f4c-shtk2\" (UID: \"b79636d7-e033-4e08-bb08-4cf8c0406522\") " pod="openstack/dnsmasq-dns-66c59f6f4c-shtk2" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.311744 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb-config\") pod \"dnsmasq-dns-5fdd4c647c-cqjlp\" (UID: \"5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb\") " pod="openstack/dnsmasq-dns-5fdd4c647c-cqjlp" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.311771 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b79636d7-e033-4e08-bb08-4cf8c0406522-dns-svc\") pod \"dnsmasq-dns-66c59f6f4c-shtk2\" (UID: \"b79636d7-e033-4e08-bb08-4cf8c0406522\") " pod="openstack/dnsmasq-dns-66c59f6f4c-shtk2" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.329475 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94lrg\" (UniqueName: \"kubernetes.io/projected/5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb-kube-api-access-94lrg\") pod \"dnsmasq-dns-5fdd4c647c-cqjlp\" (UID: \"5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb\") " pod="openstack/dnsmasq-dns-5fdd4c647c-cqjlp" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.329490 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qj7v\" (UniqueName: \"kubernetes.io/projected/b79636d7-e033-4e08-bb08-4cf8c0406522-kube-api-access-9qj7v\") pod \"dnsmasq-dns-66c59f6f4c-shtk2\" (UID: \"b79636d7-e033-4e08-bb08-4cf8c0406522\") " pod="openstack/dnsmasq-dns-66c59f6f4c-shtk2" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.456042 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdd4c647c-cqjlp" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.487871 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66c59f6f4c-shtk2" Oct 03 18:30:11 crc kubenswrapper[4835]: I1003 18:30:11.970378 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66c59f6f4c-shtk2"] Oct 03 18:30:12 crc kubenswrapper[4835]: I1003 18:30:12.001574 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fdd4c647c-cqjlp"] Oct 03 18:30:12 crc kubenswrapper[4835]: W1003 18:30:12.007303 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e1ca2f3_b56e_40c2_9bcf_50dd4cb20cbb.slice/crio-deb7377f05dc2580b74ebe108db441b84eb2b3a5b2499cf6dd123224ed55062f WatchSource:0}: Error finding container deb7377f05dc2580b74ebe108db441b84eb2b3a5b2499cf6dd123224ed55062f: Status 404 returned error can't find the container with id deb7377f05dc2580b74ebe108db441b84eb2b3a5b2499cf6dd123224ed55062f Oct 03 18:30:12 crc kubenswrapper[4835]: I1003 18:30:12.554233 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdd4c647c-cqjlp" event={"ID":"5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb","Type":"ContainerStarted","Data":"deb7377f05dc2580b74ebe108db441b84eb2b3a5b2499cf6dd123224ed55062f"} Oct 03 18:30:12 crc kubenswrapper[4835]: I1003 18:30:12.556021 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c59f6f4c-shtk2" event={"ID":"b79636d7-e033-4e08-bb08-4cf8c0406522","Type":"ContainerStarted","Data":"024ebb5072f0bc7ca1451da5fc5f64839bb44a7eaa44951209dd911727762df6"} Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.009890 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdd4c647c-cqjlp"] Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.038875 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6759f6bdd7-59j4z"] Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.040039 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6759f6bdd7-59j4z" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.050611 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6759f6bdd7-59j4z"] Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.073753 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87gzc\" (UniqueName: \"kubernetes.io/projected/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-kube-api-access-87gzc\") pod \"dnsmasq-dns-6759f6bdd7-59j4z\" (UID: \"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2\") " pod="openstack/dnsmasq-dns-6759f6bdd7-59j4z" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.073833 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-config\") pod \"dnsmasq-dns-6759f6bdd7-59j4z\" (UID: \"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2\") " pod="openstack/dnsmasq-dns-6759f6bdd7-59j4z" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.073871 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-dns-svc\") pod \"dnsmasq-dns-6759f6bdd7-59j4z\" (UID: \"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2\") " pod="openstack/dnsmasq-dns-6759f6bdd7-59j4z" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.175461 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87gzc\" (UniqueName: \"kubernetes.io/projected/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-kube-api-access-87gzc\") pod \"dnsmasq-dns-6759f6bdd7-59j4z\" (UID: \"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2\") " pod="openstack/dnsmasq-dns-6759f6bdd7-59j4z" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.175552 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-config\") pod \"dnsmasq-dns-6759f6bdd7-59j4z\" (UID: \"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2\") " pod="openstack/dnsmasq-dns-6759f6bdd7-59j4z" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.175618 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-dns-svc\") pod \"dnsmasq-dns-6759f6bdd7-59j4z\" (UID: \"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2\") " pod="openstack/dnsmasq-dns-6759f6bdd7-59j4z" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.176573 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-dns-svc\") pod \"dnsmasq-dns-6759f6bdd7-59j4z\" (UID: \"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2\") " pod="openstack/dnsmasq-dns-6759f6bdd7-59j4z" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.176722 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-config\") pod \"dnsmasq-dns-6759f6bdd7-59j4z\" (UID: \"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2\") " pod="openstack/dnsmasq-dns-6759f6bdd7-59j4z" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.198019 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87gzc\" (UniqueName: 
\"kubernetes.io/projected/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-kube-api-access-87gzc\") pod \"dnsmasq-dns-6759f6bdd7-59j4z\" (UID: \"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2\") " pod="openstack/dnsmasq-dns-6759f6bdd7-59j4z" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.303549 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66c59f6f4c-shtk2"] Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.322405 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84c5f764c9-jl5ht"] Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.323605 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.337081 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84c5f764c9-jl5ht"] Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.373829 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6759f6bdd7-59j4z" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.378174 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9757462c-6078-422a-9786-e82455a9bcd6-config\") pod \"dnsmasq-dns-84c5f764c9-jl5ht\" (UID: \"9757462c-6078-422a-9786-e82455a9bcd6\") " pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.378214 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bstm5\" (UniqueName: \"kubernetes.io/projected/9757462c-6078-422a-9786-e82455a9bcd6-kube-api-access-bstm5\") pod \"dnsmasq-dns-84c5f764c9-jl5ht\" (UID: \"9757462c-6078-422a-9786-e82455a9bcd6\") " pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.378236 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9757462c-6078-422a-9786-e82455a9bcd6-dns-svc\") pod \"dnsmasq-dns-84c5f764c9-jl5ht\" (UID: \"9757462c-6078-422a-9786-e82455a9bcd6\") " pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.479407 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9757462c-6078-422a-9786-e82455a9bcd6-config\") pod \"dnsmasq-dns-84c5f764c9-jl5ht\" (UID: \"9757462c-6078-422a-9786-e82455a9bcd6\") " pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.479459 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bstm5\" (UniqueName: \"kubernetes.io/projected/9757462c-6078-422a-9786-e82455a9bcd6-kube-api-access-bstm5\") pod \"dnsmasq-dns-84c5f764c9-jl5ht\" (UID: \"9757462c-6078-422a-9786-e82455a9bcd6\") " pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.479482 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9757462c-6078-422a-9786-e82455a9bcd6-dns-svc\") pod \"dnsmasq-dns-84c5f764c9-jl5ht\" (UID: \"9757462c-6078-422a-9786-e82455a9bcd6\") " pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.480598 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9757462c-6078-422a-9786-e82455a9bcd6-dns-svc\") pod \"dnsmasq-dns-84c5f764c9-jl5ht\" (UID: \"9757462c-6078-422a-9786-e82455a9bcd6\") " pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.482667 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9757462c-6078-422a-9786-e82455a9bcd6-config\") pod \"dnsmasq-dns-84c5f764c9-jl5ht\" (UID: \"9757462c-6078-422a-9786-e82455a9bcd6\") " pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.499953 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bstm5\" (UniqueName: \"kubernetes.io/projected/9757462c-6078-422a-9786-e82455a9bcd6-kube-api-access-bstm5\") pod \"dnsmasq-dns-84c5f764c9-jl5ht\" (UID: \"9757462c-6078-422a-9786-e82455a9bcd6\") " pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.597219 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6759f6bdd7-59j4z"] Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.618587 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6564457b49-fg4vg"] Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.619882 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6564457b49-fg4vg" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.630351 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6564457b49-fg4vg"] Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.637973 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.682250 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba05d20e-0cd7-4c1a-bee5-45f439b42518-dns-svc\") pod \"dnsmasq-dns-6564457b49-fg4vg\" (UID: \"ba05d20e-0cd7-4c1a-bee5-45f439b42518\") " pod="openstack/dnsmasq-dns-6564457b49-fg4vg" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.682311 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhjnb\" (UniqueName: \"kubernetes.io/projected/ba05d20e-0cd7-4c1a-bee5-45f439b42518-kube-api-access-xhjnb\") pod \"dnsmasq-dns-6564457b49-fg4vg\" (UID: \"ba05d20e-0cd7-4c1a-bee5-45f439b42518\") " pod="openstack/dnsmasq-dns-6564457b49-fg4vg" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.682339 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba05d20e-0cd7-4c1a-bee5-45f439b42518-config\") pod \"dnsmasq-dns-6564457b49-fg4vg\" (UID: \"ba05d20e-0cd7-4c1a-bee5-45f439b42518\") " pod="openstack/dnsmasq-dns-6564457b49-fg4vg" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.783532 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba05d20e-0cd7-4c1a-bee5-45f439b42518-dns-svc\") pod \"dnsmasq-dns-6564457b49-fg4vg\" (UID: \"ba05d20e-0cd7-4c1a-bee5-45f439b42518\") " pod="openstack/dnsmasq-dns-6564457b49-fg4vg" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.783593 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhjnb\" (UniqueName: \"kubernetes.io/projected/ba05d20e-0cd7-4c1a-bee5-45f439b42518-kube-api-access-xhjnb\") pod \"dnsmasq-dns-6564457b49-fg4vg\" (UID: \"ba05d20e-0cd7-4c1a-bee5-45f439b42518\") " pod="openstack/dnsmasq-dns-6564457b49-fg4vg" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.783623 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba05d20e-0cd7-4c1a-bee5-45f439b42518-config\") pod \"dnsmasq-dns-6564457b49-fg4vg\" (UID: \"ba05d20e-0cd7-4c1a-bee5-45f439b42518\") " pod="openstack/dnsmasq-dns-6564457b49-fg4vg" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.784600 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba05d20e-0cd7-4c1a-bee5-45f439b42518-dns-svc\") pod \"dnsmasq-dns-6564457b49-fg4vg\" (UID: \"ba05d20e-0cd7-4c1a-bee5-45f439b42518\") " pod="openstack/dnsmasq-dns-6564457b49-fg4vg" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.784658 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba05d20e-0cd7-4c1a-bee5-45f439b42518-config\") pod \"dnsmasq-dns-6564457b49-fg4vg\" (UID: \"ba05d20e-0cd7-4c1a-bee5-45f439b42518\") " pod="openstack/dnsmasq-dns-6564457b49-fg4vg" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.817902 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhjnb\" (UniqueName: \"kubernetes.io/projected/ba05d20e-0cd7-4c1a-bee5-45f439b42518-kube-api-access-xhjnb\") pod \"dnsmasq-dns-6564457b49-fg4vg\" (UID: \"ba05d20e-0cd7-4c1a-bee5-45f439b42518\") " 
pod="openstack/dnsmasq-dns-6564457b49-fg4vg" Oct 03 18:30:15 crc kubenswrapper[4835]: I1003 18:30:15.942746 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6564457b49-fg4vg" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.194518 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.195630 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.199329 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.199340 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.199549 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.206926 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.207205 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.207625 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-4cvq6" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.210744 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.216180 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.290312 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.290358 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gtln\" (UniqueName: \"kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-kube-api-access-7gtln\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.290381 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.290396 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-config-data\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.290420 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6fd26bdb-868b-49db-9698-e7c79eea5cef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.290444 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.290475 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6fd26bdb-868b-49db-9698-e7c79eea5cef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.290500 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.290518 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.290560 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.290610 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.391627 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.391688 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.391717 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gtln\" (UniqueName: 
\"kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-kube-api-access-7gtln\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.391738 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.391754 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-config-data\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.391776 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6fd26bdb-868b-49db-9698-e7c79eea5cef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.391793 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.391815 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6fd26bdb-868b-49db-9698-e7c79eea5cef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.391844 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.391863 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.391905 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.392172 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 
18:30:16.392489 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.393192 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-config-data\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.393430 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.395568 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.395717 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.400512 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.404756 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.407584 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6fd26bdb-868b-49db-9698-e7c79eea5cef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.407890 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gtln\" (UniqueName: \"kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-kube-api-access-7gtln\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.410631 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6fd26bdb-868b-49db-9698-e7c79eea5cef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " 
pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.415708 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.460359 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.467724 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.471045 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.471515 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.471676 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.471980 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.472745 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.473156 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.487148 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-gltwr" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.494296 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.515433 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.594567 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.594617 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.594653 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.594677 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq47v\" (UniqueName: \"kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-kube-api-access-tq47v\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.594717 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.594731 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2f5f99aa-dba6-465b-866a-1e293ba51685-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.594752 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.594778 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2f5f99aa-dba6-465b-866a-1e293ba51685-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.594801 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.594823 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.594844 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.696100 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq47v\" (UniqueName: \"kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-kube-api-access-tq47v\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.696177 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.696198 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2f5f99aa-dba6-465b-866a-1e293ba51685-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.696219 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.696246 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2f5f99aa-dba6-465b-866a-1e293ba51685-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.696271 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.696296 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc 
kubenswrapper[4835]: I1003 18:30:16.696317 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.696333 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.696355 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.696414 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.697226 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.698396 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.698688 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.699262 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.699482 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.700020 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.701876 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.711837 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2f5f99aa-dba6-465b-866a-1e293ba51685-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.719171 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.719998 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2f5f99aa-dba6-465b-866a-1e293ba51685-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.720294 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq47v\" (UniqueName: \"kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-kube-api-access-tq47v\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.731641 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.733165 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.739849 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.742584 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.742837 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-ggj4f" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.743104 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.743588 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.743627 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.743696 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.743860 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.744024 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.802431 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.899242 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b17ce629-9abd-42ba-8004-cc4b85cee405-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.899291 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b17ce629-9abd-42ba-8004-cc4b85cee405-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.899317 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b17ce629-9abd-42ba-8004-cc4b85cee405-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.899427 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.899498 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b17ce629-9abd-42ba-8004-cc4b85cee405-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.899632 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b17ce629-9abd-42ba-8004-cc4b85cee405-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.899665 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b17ce629-9abd-42ba-8004-cc4b85cee405-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.899702 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b17ce629-9abd-42ba-8004-cc4b85cee405-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.899724 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v8jr\" 
(UniqueName: \"kubernetes.io/projected/b17ce629-9abd-42ba-8004-cc4b85cee405-kube-api-access-9v8jr\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.899748 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b17ce629-9abd-42ba-8004-cc4b85cee405-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:16 crc kubenswrapper[4835]: I1003 18:30:16.899802 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b17ce629-9abd-42ba-8004-cc4b85cee405-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.001922 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b17ce629-9abd-42ba-8004-cc4b85cee405-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.001999 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b17ce629-9abd-42ba-8004-cc4b85cee405-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.002058 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b17ce629-9abd-42ba-8004-cc4b85cee405-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.002111 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v8jr\" (UniqueName: \"kubernetes.io/projected/b17ce629-9abd-42ba-8004-cc4b85cee405-kube-api-access-9v8jr\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.002135 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b17ce629-9abd-42ba-8004-cc4b85cee405-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.002155 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b17ce629-9abd-42ba-8004-cc4b85cee405-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.002220 4835 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b17ce629-9abd-42ba-8004-cc4b85cee405-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.002279 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b17ce629-9abd-42ba-8004-cc4b85cee405-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.002329 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b17ce629-9abd-42ba-8004-cc4b85cee405-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.002417 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.002608 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b17ce629-9abd-42ba-8004-cc4b85cee405-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.002781 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.003640 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b17ce629-9abd-42ba-8004-cc4b85cee405-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.003896 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b17ce629-9abd-42ba-8004-cc4b85cee405-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.003912 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b17ce629-9abd-42ba-8004-cc4b85cee405-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.004215 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b17ce629-9abd-42ba-8004-cc4b85cee405-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.004389 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b17ce629-9abd-42ba-8004-cc4b85cee405-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.005509 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b17ce629-9abd-42ba-8004-cc4b85cee405-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.006838 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b17ce629-9abd-42ba-8004-cc4b85cee405-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.008516 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b17ce629-9abd-42ba-8004-cc4b85cee405-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.008587 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b17ce629-9abd-42ba-8004-cc4b85cee405-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.020171 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v8jr\" (UniqueName: \"kubernetes.io/projected/b17ce629-9abd-42ba-8004-cc4b85cee405-kube-api-access-9v8jr\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.035968 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"b17ce629-9abd-42ba-8004-cc4b85cee405\") " pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:17 crc kubenswrapper[4835]: I1003 18:30:17.080046 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.637161 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.640435 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.645828 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.646096 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-74v2p" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.646247 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.646714 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.649515 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.655262 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.655975 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.747533 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/71cb8688-6214-4e5e-a7da-051c5939df65-secrets\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.747619 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cb8688-6214-4e5e-a7da-051c5939df65-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.747644 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/71cb8688-6214-4e5e-a7da-051c5939df65-config-data-generated\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.747661 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkl8h\" (UniqueName: \"kubernetes.io/projected/71cb8688-6214-4e5e-a7da-051c5939df65-kube-api-access-kkl8h\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.747678 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71cb8688-6214-4e5e-a7da-051c5939df65-kolla-config\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.747698 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71cb8688-6214-4e5e-a7da-051c5939df65-operator-scripts\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " 
pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.747726 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/71cb8688-6214-4e5e-a7da-051c5939df65-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.747755 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/71cb8688-6214-4e5e-a7da-051c5939df65-config-data-default\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.747787 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.849536 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/71cb8688-6214-4e5e-a7da-051c5939df65-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.849599 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/71cb8688-6214-4e5e-a7da-051c5939df65-config-data-default\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.849635 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.849690 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/71cb8688-6214-4e5e-a7da-051c5939df65-secrets\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.849741 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cb8688-6214-4e5e-a7da-051c5939df65-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.849759 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/71cb8688-6214-4e5e-a7da-051c5939df65-config-data-generated\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.849774 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kkl8h\" (UniqueName: \"kubernetes.io/projected/71cb8688-6214-4e5e-a7da-051c5939df65-kube-api-access-kkl8h\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.849793 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71cb8688-6214-4e5e-a7da-051c5939df65-kolla-config\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.849829 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71cb8688-6214-4e5e-a7da-051c5939df65-operator-scripts\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.850372 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/71cb8688-6214-4e5e-a7da-051c5939df65-config-data-generated\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.850422 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.850929 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/71cb8688-6214-4e5e-a7da-051c5939df65-config-data-default\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.851167 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71cb8688-6214-4e5e-a7da-051c5939df65-operator-scripts\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.852421 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/71cb8688-6214-4e5e-a7da-051c5939df65-kolla-config\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.857778 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/71cb8688-6214-4e5e-a7da-051c5939df65-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.863430 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/71cb8688-6214-4e5e-a7da-051c5939df65-secrets\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc 
kubenswrapper[4835]: I1003 18:30:19.864837 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71cb8688-6214-4e5e-a7da-051c5939df65-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.865400 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkl8h\" (UniqueName: \"kubernetes.io/projected/71cb8688-6214-4e5e-a7da-051c5939df65-kube-api-access-kkl8h\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.882002 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"71cb8688-6214-4e5e-a7da-051c5939df65\") " pod="openstack/openstack-galera-0" Oct 03 18:30:19 crc kubenswrapper[4835]: I1003 18:30:19.968674 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.041385 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.042708 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.046397 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.046569 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.046830 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.046879 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-sb49n" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.056631 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.153243 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.156290 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.156346 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.156415 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.156449 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.156512 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.156574 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r7sl\" (UniqueName: \"kubernetes.io/projected/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-kube-api-access-4r7sl\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.156612 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.156728 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.258014 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.258092 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.258127 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") 
" pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.258172 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.258220 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.258251 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.258282 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.258344 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r7sl\" (UniqueName: \"kubernetes.io/projected/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-kube-api-access-4r7sl\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.258373 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.259256 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.259898 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.260146 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc 
kubenswrapper[4835]: I1003 18:30:20.260561 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.261119 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.265228 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.265817 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.265888 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.285596 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r7sl\" (UniqueName: \"kubernetes.io/projected/d91a9a1f-a39c-4a80-8bf4-1196bacc8870-kube-api-access-4r7sl\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.292674 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d91a9a1f-a39c-4a80-8bf4-1196bacc8870\") " pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.363110 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.434381 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.435329 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.437623 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.437762 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-d9g48" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.437992 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.450006 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.561937 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vl7m\" (UniqueName: \"kubernetes.io/projected/69b75d74-7a6f-40ff-9c5c-481ced22eec0-kube-api-access-5vl7m\") pod \"memcached-0\" (UID: \"69b75d74-7a6f-40ff-9c5c-481ced22eec0\") " pod="openstack/memcached-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.561984 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69b75d74-7a6f-40ff-9c5c-481ced22eec0-config-data\") pod \"memcached-0\" (UID: \"69b75d74-7a6f-40ff-9c5c-481ced22eec0\") " pod="openstack/memcached-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.562362 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69b75d74-7a6f-40ff-9c5c-481ced22eec0-kolla-config\") pod \"memcached-0\" (UID: \"69b75d74-7a6f-40ff-9c5c-481ced22eec0\") " pod="openstack/memcached-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.562430 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/69b75d74-7a6f-40ff-9c5c-481ced22eec0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"69b75d74-7a6f-40ff-9c5c-481ced22eec0\") " pod="openstack/memcached-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.562508 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b75d74-7a6f-40ff-9c5c-481ced22eec0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"69b75d74-7a6f-40ff-9c5c-481ced22eec0\") " pod="openstack/memcached-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.664017 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vl7m\" (UniqueName: \"kubernetes.io/projected/69b75d74-7a6f-40ff-9c5c-481ced22eec0-kube-api-access-5vl7m\") pod \"memcached-0\" (UID: \"69b75d74-7a6f-40ff-9c5c-481ced22eec0\") " pod="openstack/memcached-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.664097 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69b75d74-7a6f-40ff-9c5c-481ced22eec0-config-data\") pod \"memcached-0\" (UID: \"69b75d74-7a6f-40ff-9c5c-481ced22eec0\") " pod="openstack/memcached-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.664210 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/69b75d74-7a6f-40ff-9c5c-481ced22eec0-kolla-config\") pod \"memcached-0\" (UID: \"69b75d74-7a6f-40ff-9c5c-481ced22eec0\") " pod="openstack/memcached-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.664238 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/69b75d74-7a6f-40ff-9c5c-481ced22eec0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"69b75d74-7a6f-40ff-9c5c-481ced22eec0\") " pod="openstack/memcached-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.664267 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b75d74-7a6f-40ff-9c5c-481ced22eec0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"69b75d74-7a6f-40ff-9c5c-481ced22eec0\") " pod="openstack/memcached-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.665387 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69b75d74-7a6f-40ff-9c5c-481ced22eec0-config-data\") pod \"memcached-0\" (UID: \"69b75d74-7a6f-40ff-9c5c-481ced22eec0\") " pod="openstack/memcached-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.665603 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69b75d74-7a6f-40ff-9c5c-481ced22eec0-kolla-config\") pod \"memcached-0\" (UID: \"69b75d74-7a6f-40ff-9c5c-481ced22eec0\") " pod="openstack/memcached-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.667906 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/69b75d74-7a6f-40ff-9c5c-481ced22eec0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"69b75d74-7a6f-40ff-9c5c-481ced22eec0\") " pod="openstack/memcached-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.670233 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b75d74-7a6f-40ff-9c5c-481ced22eec0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"69b75d74-7a6f-40ff-9c5c-481ced22eec0\") " pod="openstack/memcached-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.680644 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vl7m\" (UniqueName: \"kubernetes.io/projected/69b75d74-7a6f-40ff-9c5c-481ced22eec0-kube-api-access-5vl7m\") pod \"memcached-0\" (UID: \"69b75d74-7a6f-40ff-9c5c-481ced22eec0\") " pod="openstack/memcached-0" Oct 03 18:30:20 crc kubenswrapper[4835]: I1003 18:30:20.752728 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 18:30:22 crc kubenswrapper[4835]: I1003 18:30:22.045500 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 18:30:22 crc kubenswrapper[4835]: I1003 18:30:22.046760 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 18:30:22 crc kubenswrapper[4835]: I1003 18:30:22.049111 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-p65bd" Oct 03 18:30:22 crc kubenswrapper[4835]: I1003 18:30:22.057725 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 18:30:22 crc kubenswrapper[4835]: I1003 18:30:22.187692 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lcrg\" (UniqueName: \"kubernetes.io/projected/a53486c1-995b-46d5-84a0-f74f2ec2b5ba-kube-api-access-7lcrg\") pod \"kube-state-metrics-0\" (UID: \"a53486c1-995b-46d5-84a0-f74f2ec2b5ba\") " pod="openstack/kube-state-metrics-0" Oct 03 18:30:22 crc kubenswrapper[4835]: I1003 18:30:22.289387 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lcrg\" (UniqueName: \"kubernetes.io/projected/a53486c1-995b-46d5-84a0-f74f2ec2b5ba-kube-api-access-7lcrg\") pod \"kube-state-metrics-0\" (UID: \"a53486c1-995b-46d5-84a0-f74f2ec2b5ba\") " pod="openstack/kube-state-metrics-0" Oct 03 18:30:22 crc kubenswrapper[4835]: I1003 18:30:22.320405 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lcrg\" (UniqueName: \"kubernetes.io/projected/a53486c1-995b-46d5-84a0-f74f2ec2b5ba-kube-api-access-7lcrg\") pod \"kube-state-metrics-0\" (UID: \"a53486c1-995b-46d5-84a0-f74f2ec2b5ba\") " pod="openstack/kube-state-metrics-0" Oct 03 18:30:22 crc kubenswrapper[4835]: I1003 18:30:22.362858 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.399234 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.403504 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.405641 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.405953 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.406019 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.405959 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.406890 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-9kc6v" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.407436 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.413869 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.510723 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.510776 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.510799 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.510815 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.511143 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bed34091-a921-4906-9121-f482ec67e99a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.511225 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2jbm\" (UniqueName: \"kubernetes.io/projected/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-kube-api-access-s2jbm\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.511273 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-config\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.511392 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.613321 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.613402 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.613433 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.613454 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.613523 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bed34091-a921-4906-9121-f482ec67e99a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.613548 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2jbm\" (UniqueName: \"kubernetes.io/projected/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-kube-api-access-s2jbm\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" 
Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.613573 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-config\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.614002 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.614798 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.617333 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.617699 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.618016 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.618051 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bed34091-a921-4906-9121-f482ec67e99a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/88bcb9a338b078b35fe2aaa6fcd1ca51c30be9164778c602eed472976adc1b23/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.619357 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.626687 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-config\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.627235 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.630362 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2jbm\" (UniqueName: \"kubernetes.io/projected/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-kube-api-access-s2jbm\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.655810 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bed34091-a921-4906-9121-f482ec67e99a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") pod \"prometheus-metric-storage-0\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:23 crc kubenswrapper[4835]: I1003 18:30:23.727440 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.541374 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-plxn4"] Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.542656 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.544540 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.544913 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.545130 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-sfwp7" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.557315 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-plxn4"] Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.567760 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-gkx58"] Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.569410 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.601893 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gkx58"] Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.656921 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-var-run\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.656964 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c9280b95-ef96-4c58-948f-2abcd7ad8a25-var-log-ovn\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.656984 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ml4j\" (UniqueName: \"kubernetes.io/projected/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-kube-api-access-9ml4j\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.657011 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-var-log\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.657155 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9280b95-ef96-4c58-948f-2abcd7ad8a25-ovn-controller-tls-certs\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.657260 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9280b95-ef96-4c58-948f-2abcd7ad8a25-scripts\") pod \"ovn-controller-plxn4\" (UID: 
\"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.657316 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c9280b95-ef96-4c58-948f-2abcd7ad8a25-var-run\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.657375 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9280b95-ef96-4c58-948f-2abcd7ad8a25-combined-ca-bundle\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.657512 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-var-lib\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.657552 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4gqw\" (UniqueName: \"kubernetes.io/projected/c9280b95-ef96-4c58-948f-2abcd7ad8a25-kube-api-access-x4gqw\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.657600 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-scripts\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.657682 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-etc-ovs\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.657711 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9280b95-ef96-4c58-948f-2abcd7ad8a25-var-run-ovn\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.758649 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c9280b95-ef96-4c58-948f-2abcd7ad8a25-var-run\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.758703 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9280b95-ef96-4c58-948f-2abcd7ad8a25-combined-ca-bundle\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 
18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.758758 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-var-lib\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.758778 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4gqw\" (UniqueName: \"kubernetes.io/projected/c9280b95-ef96-4c58-948f-2abcd7ad8a25-kube-api-access-x4gqw\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.758805 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-scripts\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.758848 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-etc-ovs\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.758867 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9280b95-ef96-4c58-948f-2abcd7ad8a25-var-run-ovn\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.758882 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-var-run\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.758901 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c9280b95-ef96-4c58-948f-2abcd7ad8a25-var-log-ovn\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.758916 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ml4j\" (UniqueName: \"kubernetes.io/projected/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-kube-api-access-9ml4j\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.758934 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-var-log\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.758948 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c9280b95-ef96-4c58-948f-2abcd7ad8a25-ovn-controller-tls-certs\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.758967 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9280b95-ef96-4c58-948f-2abcd7ad8a25-scripts\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.760875 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9280b95-ef96-4c58-948f-2abcd7ad8a25-scripts\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.763609 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-var-lib\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.763790 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-var-run\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.764255 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-etc-ovs\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.764582 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c9280b95-ef96-4c58-948f-2abcd7ad8a25-var-run\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.764712 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-var-log\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.764772 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c9280b95-ef96-4c58-948f-2abcd7ad8a25-var-log-ovn\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.764840 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9280b95-ef96-4c58-948f-2abcd7ad8a25-var-run-ovn\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.770153 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9280b95-ef96-4c58-948f-2abcd7ad8a25-ovn-controller-tls-certs\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.775762 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9280b95-ef96-4c58-948f-2abcd7ad8a25-combined-ca-bundle\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.778884 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-scripts\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.781143 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ml4j\" (UniqueName: \"kubernetes.io/projected/6ad2fc66-9ffc-4229-a5d0-63c8239c8c69-kube-api-access-9ml4j\") pod \"ovn-controller-ovs-gkx58\" (UID: \"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69\") " pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.783989 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4gqw\" (UniqueName: \"kubernetes.io/projected/c9280b95-ef96-4c58-948f-2abcd7ad8a25-kube-api-access-x4gqw\") pod \"ovn-controller-plxn4\" (UID: \"c9280b95-ef96-4c58-948f-2abcd7ad8a25\") " pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.860118 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-plxn4" Oct 03 18:30:25 crc kubenswrapper[4835]: I1003 18:30:25.890176 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.538764 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.540436 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.543745 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.543868 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.543944 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-hm89t" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.543994 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.544158 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.562442 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.675885 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90ba54d0-8893-4168-a725-993778708104-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.675959 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/90ba54d0-8893-4168-a725-993778708104-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.676001 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90ba54d0-8893-4168-a725-993778708104-config\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.676022 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/90ba54d0-8893-4168-a725-993778708104-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.676042 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90ba54d0-8893-4168-a725-993778708104-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.676092 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82qhg\" (UniqueName: \"kubernetes.io/projected/90ba54d0-8893-4168-a725-993778708104-kube-api-access-82qhg\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.676121 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.676142 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ba54d0-8893-4168-a725-993778708104-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.777431 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90ba54d0-8893-4168-a725-993778708104-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.777823 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/90ba54d0-8893-4168-a725-993778708104-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.777862 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90ba54d0-8893-4168-a725-993778708104-config\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.777890 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/90ba54d0-8893-4168-a725-993778708104-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.777918 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90ba54d0-8893-4168-a725-993778708104-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.777961 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82qhg\" (UniqueName: \"kubernetes.io/projected/90ba54d0-8893-4168-a725-993778708104-kube-api-access-82qhg\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.777998 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.778026 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ba54d0-8893-4168-a725-993778708104-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 
18:30:26.778233 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/90ba54d0-8893-4168-a725-993778708104-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.778787 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.779111 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90ba54d0-8893-4168-a725-993778708104-config\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.779188 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90ba54d0-8893-4168-a725-993778708104-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.782961 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/90ba54d0-8893-4168-a725-993778708104-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.783468 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90ba54d0-8893-4168-a725-993778708104-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.783887 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90ba54d0-8893-4168-a725-993778708104-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.796822 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82qhg\" (UniqueName: \"kubernetes.io/projected/90ba54d0-8893-4168-a725-993778708104-kube-api-access-82qhg\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.799193 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"90ba54d0-8893-4168-a725-993778708104\") " pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:26 crc kubenswrapper[4835]: I1003 18:30:26.858081 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:27 crc kubenswrapper[4835]: I1003 18:30:27.556034 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6564457b49-fg4vg"] Oct 03 18:30:28 crc kubenswrapper[4835]: W1003 18:30:28.042381 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba05d20e_0cd7_4c1a_bee5_45f439b42518.slice/crio-d2f8415eb3d32205846b4101855825f27fa43c43f90138bd0b87b84279d7efd5 WatchSource:0}: Error finding container d2f8415eb3d32205846b4101855825f27fa43c43f90138bd0b87b84279d7efd5: Status 404 returned error can't find the container with id d2f8415eb3d32205846b4101855825f27fa43c43f90138bd0b87b84279d7efd5 Oct 03 18:30:28 crc kubenswrapper[4835]: E1003 18:30:28.057385 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 03 18:30:28 crc kubenswrapper[4835]: E1003 18:30:28.057440 4835 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 03 18:30:28 crc kubenswrapper[4835]: E1003 18:30:28.057554 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.82:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9qj7v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod dnsmasq-dns-66c59f6f4c-shtk2_openstack(b79636d7-e033-4e08-bb08-4cf8c0406522): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 18:30:28 crc kubenswrapper[4835]: E1003 18:30:28.058761 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-66c59f6f4c-shtk2" podUID="b79636d7-e033-4e08-bb08-4cf8c0406522" Oct 03 18:30:28 crc kubenswrapper[4835]: E1003 18:30:28.085376 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 03 18:30:28 crc kubenswrapper[4835]: E1003 18:30:28.085427 4835 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 03 18:30:28 crc kubenswrapper[4835]: E1003 18:30:28.085528 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.82:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-94lrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5fdd4c647c-cqjlp_openstack(5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 18:30:28 crc kubenswrapper[4835]: E1003 18:30:28.086963 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5fdd4c647c-cqjlp" podUID="5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb" Oct 03 18:30:28 crc kubenswrapper[4835]: I1003 18:30:28.693717 4835 generic.go:334] "Generic (PLEG): container finished" podID="ba05d20e-0cd7-4c1a-bee5-45f439b42518" containerID="cfaf8ddd76234769f32c6fbbc1b11ea5a0c3ab4eceb65c985e90b4146de76573" exitCode=0 Oct 03 18:30:28 crc kubenswrapper[4835]: I1003 18:30:28.693946 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6564457b49-fg4vg" event={"ID":"ba05d20e-0cd7-4c1a-bee5-45f439b42518","Type":"ContainerDied","Data":"cfaf8ddd76234769f32c6fbbc1b11ea5a0c3ab4eceb65c985e90b4146de76573"} Oct 03 18:30:28 crc kubenswrapper[4835]: I1003 18:30:28.694320 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6564457b49-fg4vg" event={"ID":"ba05d20e-0cd7-4c1a-bee5-45f439b42518","Type":"ContainerStarted","Data":"d2f8415eb3d32205846b4101855825f27fa43c43f90138bd0b87b84279d7efd5"} Oct 03 18:30:28 crc kubenswrapper[4835]: I1003 18:30:28.927440 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 03 18:30:28 crc kubenswrapper[4835]: W1003 18:30:28.933759 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb17ce629_9abd_42ba_8004_cc4b85cee405.slice/crio-da412f58012cd5327ebd304cfccdf347af97f5ce6ad69b2387e95dbe62c7273b WatchSource:0}: Error finding container da412f58012cd5327ebd304cfccdf347af97f5ce6ad69b2387e95dbe62c7273b: Status 404 returned error can't find the container with id da412f58012cd5327ebd304cfccdf347af97f5ce6ad69b2387e95dbe62c7273b Oct 03 18:30:28 crc kubenswrapper[4835]: I1003 18:30:28.959639 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84c5f764c9-jl5ht"] Oct 03 18:30:28 crc kubenswrapper[4835]: I1003 18:30:28.977762 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 18:30:28 crc kubenswrapper[4835]: I1003 18:30:28.995849 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 18:30:29 crc kubenswrapper[4835]: W1003 18:30:29.018214 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69b75d74_7a6f_40ff_9c5c_481ced22eec0.slice/crio-1a4014601b109158765c56214814ab37326cd64b5545a785f11f73eed627beca WatchSource:0}: Error finding container 1a4014601b109158765c56214814ab37326cd64b5545a785f11f73eed627beca: Status 404 returned error can't find the container with id 1a4014601b109158765c56214814ab37326cd64b5545a785f11f73eed627beca Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.018908 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.239009 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gkx58"] Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.262801 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-kkp6t"] Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.264160 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.266778 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.274145 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kkp6t"] Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.336993 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-combined-ca-bundle\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.337086 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-ovn-rundir\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.337108 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t47gz\" (UniqueName: \"kubernetes.io/projected/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-kube-api-access-t47gz\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.337256 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-config\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.337486 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.337509 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-ovs-rundir\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.446607 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-combined-ca-bundle\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.446689 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-ovn-rundir\") pod \"ovn-controller-metrics-kkp6t\" (UID: 
\"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.446710 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t47gz\" (UniqueName: \"kubernetes.io/projected/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-kube-api-access-t47gz\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.446742 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-config\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.446818 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.446835 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-ovs-rundir\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.447198 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-ovs-rundir\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.447249 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-ovn-rundir\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.448192 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-config\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.458871 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.459037 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-combined-ca-bundle\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 
18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.462982 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t47gz\" (UniqueName: \"kubernetes.io/projected/f94fd7e6-3253-4adc-a1f9-188598d9ed3b-kube-api-access-t47gz\") pod \"ovn-controller-metrics-kkp6t\" (UID: \"f94fd7e6-3253-4adc-a1f9-188598d9ed3b\") " pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.465675 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.470239 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fdd4c647c-cqjlp" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.492649 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66c59f6f4c-shtk2" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.547391 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qj7v\" (UniqueName: \"kubernetes.io/projected/b79636d7-e033-4e08-bb08-4cf8c0406522-kube-api-access-9qj7v\") pod \"b79636d7-e033-4e08-bb08-4cf8c0406522\" (UID: \"b79636d7-e033-4e08-bb08-4cf8c0406522\") " Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.547459 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b79636d7-e033-4e08-bb08-4cf8c0406522-dns-svc\") pod \"b79636d7-e033-4e08-bb08-4cf8c0406522\" (UID: \"b79636d7-e033-4e08-bb08-4cf8c0406522\") " Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.547542 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94lrg\" (UniqueName: \"kubernetes.io/projected/5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb-kube-api-access-94lrg\") pod \"5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb\" (UID: \"5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb\") " Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.547614 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b79636d7-e033-4e08-bb08-4cf8c0406522-config\") pod \"b79636d7-e033-4e08-bb08-4cf8c0406522\" (UID: \"b79636d7-e033-4e08-bb08-4cf8c0406522\") " Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.547682 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb-config\") pod \"5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb\" (UID: \"5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb\") " Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.548277 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb-config" (OuterVolumeSpecName: "config") pod "5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb" (UID: "5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.548283 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b79636d7-e033-4e08-bb08-4cf8c0406522-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b79636d7-e033-4e08-bb08-4cf8c0406522" (UID: "b79636d7-e033-4e08-bb08-4cf8c0406522"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.549327 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b79636d7-e033-4e08-bb08-4cf8c0406522-config" (OuterVolumeSpecName: "config") pod "b79636d7-e033-4e08-bb08-4cf8c0406522" (UID: "b79636d7-e033-4e08-bb08-4cf8c0406522"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.585626 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b79636d7-e033-4e08-bb08-4cf8c0406522-kube-api-access-9qj7v" (OuterVolumeSpecName: "kube-api-access-9qj7v") pod "b79636d7-e033-4e08-bb08-4cf8c0406522" (UID: "b79636d7-e033-4e08-bb08-4cf8c0406522"). InnerVolumeSpecName "kube-api-access-9qj7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.586258 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb-kube-api-access-94lrg" (OuterVolumeSpecName: "kube-api-access-94lrg") pod "5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb" (UID: "5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb"). InnerVolumeSpecName "kube-api-access-94lrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.624611 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.629941 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.637027 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.649703 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b79636d7-e033-4e08-bb08-4cf8c0406522-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.649732 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.649742 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qj7v\" (UniqueName: \"kubernetes.io/projected/b79636d7-e033-4e08-bb08-4cf8c0406522-kube-api-access-9qj7v\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.649752 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b79636d7-e033-4e08-bb08-4cf8c0406522-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.649761 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94lrg\" (UniqueName: \"kubernetes.io/projected/5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb-kube-api-access-94lrg\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.651198 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6759f6bdd7-59j4z"] Oct 03 18:30:29 crc kubenswrapper[4835]: W1003 18:30:29.688091 4835 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda53486c1_995b_46d5_84a0_f74f2ec2b5ba.slice/crio-64418d5ba5174c55f5beeadfef4e07dba66dd856fd8c251e15fb9329560b6575 WatchSource:0}: Error finding container 64418d5ba5174c55f5beeadfef4e07dba66dd856fd8c251e15fb9329560b6575: Status 404 returned error can't find the container with id 64418d5ba5174c55f5beeadfef4e07dba66dd856fd8c251e15fb9329560b6575 Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.691106 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-kkp6t" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.707725 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66c59f6f4c-shtk2" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.707726 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c59f6f4c-shtk2" event={"ID":"b79636d7-e033-4e08-bb08-4cf8c0406522","Type":"ContainerDied","Data":"024ebb5072f0bc7ca1451da5fc5f64839bb44a7eaa44951209dd911727762df6"} Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.711790 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6759f6bdd7-59j4z" event={"ID":"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2","Type":"ContainerStarted","Data":"1715d742155004e973ea361571bedfca35c148290a00bc4fe989a3c4f01852ba"} Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.713659 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2f5f99aa-dba6-465b-866a-1e293ba51685","Type":"ContainerStarted","Data":"4d745c208d96515f2f2b4d897bbf4fa83a87f8e897ea2c8b2133d9c30f4e826d"} Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.714765 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"69b75d74-7a6f-40ff-9c5c-481ced22eec0","Type":"ContainerStarted","Data":"1a4014601b109158765c56214814ab37326cd64b5545a785f11f73eed627beca"} Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.716229 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fdd4c647c-cqjlp" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.716216 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fdd4c647c-cqjlp" event={"ID":"5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb","Type":"ContainerDied","Data":"deb7377f05dc2580b74ebe108db441b84eb2b3a5b2499cf6dd123224ed55062f"} Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.718044 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gkx58" event={"ID":"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69","Type":"ContainerStarted","Data":"14b190a019e86a1842893a84e73266efffbb7c987e6f8569f7cf542dede21b70"} Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.720585 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d91a9a1f-a39c-4a80-8bf4-1196bacc8870","Type":"ContainerStarted","Data":"4f8ae4d5b6fdf8c7ef86e3401681a500a7fc30dd069cec5d02609eb29984ceef"} Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.722265 4835 generic.go:334] "Generic (PLEG): container finished" podID="9757462c-6078-422a-9786-e82455a9bcd6" containerID="cbd15d9b8ab364bacb57aa251d9648682c6b8638154001c1634067a887fb63c6" exitCode=0 Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.722534 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" event={"ID":"9757462c-6078-422a-9786-e82455a9bcd6","Type":"ContainerDied","Data":"cbd15d9b8ab364bacb57aa251d9648682c6b8638154001c1634067a887fb63c6"} Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.722673 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" event={"ID":"9757462c-6078-422a-9786-e82455a9bcd6","Type":"ContainerStarted","Data":"4585807acf2ee943fc0253b59065eb5ae88a11dc635045d4e2dfca103119ea56"} Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.724731 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"b17ce629-9abd-42ba-8004-cc4b85cee405","Type":"ContainerStarted","Data":"da412f58012cd5327ebd304cfccdf347af97f5ce6ad69b2387e95dbe62c7273b"} Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.726988 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1c6c5b2c-c368-4237-83ec-18ae3d06fe61","Type":"ContainerStarted","Data":"b798453bcd08208fc60fcfb773c694f424592fedd19f0134ed4d271d4ea56a9a"} Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.732398 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6564457b49-fg4vg" event={"ID":"ba05d20e-0cd7-4c1a-bee5-45f439b42518","Type":"ContainerStarted","Data":"2fff0a00a2a2536ac8123420f6a9a2c9c3134e387db28acdc79587c567deaf56"} Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.732566 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6564457b49-fg4vg" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.734677 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a53486c1-995b-46d5-84a0-f74f2ec2b5ba","Type":"ContainerStarted","Data":"64418d5ba5174c55f5beeadfef4e07dba66dd856fd8c251e15fb9329560b6575"} Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.735787 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"71cb8688-6214-4e5e-a7da-051c5939df65","Type":"ContainerStarted","Data":"ae60f3837e543b38d7b95300b67f75480e9db861495f430c19c629c708cfab99"} Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.754736 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6fd26bdb-868b-49db-9698-e7c79eea5cef","Type":"ContainerStarted","Data":"129989100f23d84c2dd11a0337445a595bf588f441fd488492ba9b05d193a5fd"} Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.782709 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66c59f6f4c-shtk2"] Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.792537 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66c59f6f4c-shtk2"] Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.805115 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6564457b49-fg4vg" podStartSLOduration=14.661102426 podStartE2EDuration="14.805089233s" podCreationTimestamp="2025-10-03 18:30:15 +0000 UTC" firstStartedPulling="2025-10-03 18:30:28.047016881 +0000 UTC m=+969.762957753" lastFinishedPulling="2025-10-03 18:30:28.191003688 +0000 UTC m=+969.906944560" observedRunningTime="2025-10-03 18:30:29.794621523 +0000 UTC m=+971.510562415" watchObservedRunningTime="2025-10-03 18:30:29.805089233 +0000 UTC m=+971.521030115" Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.839389 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fdd4c647c-cqjlp"] Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.857574 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fdd4c647c-cqjlp"] Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.863472 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-plxn4"] Oct 03 18:30:29 crc kubenswrapper[4835]: W1003 18:30:29.891893 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9280b95_ef96_4c58_948f_2abcd7ad8a25.slice/crio-899373b5c86937e10ced8d63218bdda737c7a33ac522363f88245b3916a720b1 WatchSource:0}: Error finding container 899373b5c86937e10ced8d63218bdda737c7a33ac522363f88245b3916a720b1: Status 404 returned error can't find the container with id 899373b5c86937e10ced8d63218bdda737c7a33ac522363f88245b3916a720b1 Oct 03 18:30:29 crc kubenswrapper[4835]: I1003 18:30:29.941350 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.179745 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.181386 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.188239 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.188700 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.189527 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-44j4c" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.189562 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.227136 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.267250 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2d5b41f3-fbdf-4663-af12-1f55f598de56-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.267579 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d5b41f3-fbdf-4663-af12-1f55f598de56-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.267711 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5b41f3-fbdf-4663-af12-1f55f598de56-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.267822 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4427\" (UniqueName: \"kubernetes.io/projected/2d5b41f3-fbdf-4663-af12-1f55f598de56-kube-api-access-f4427\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.267927 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5b41f3-fbdf-4663-af12-1f55f598de56-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.268035 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d5b41f3-fbdf-4663-af12-1f55f598de56-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.268164 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.268309 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d5b41f3-fbdf-4663-af12-1f55f598de56-config\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: E1003 18:30:30.324444 4835 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 03 18:30:30 crc kubenswrapper[4835]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/9757462c-6078-422a-9786-e82455a9bcd6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 03 18:30:30 crc kubenswrapper[4835]: > podSandboxID="4585807acf2ee943fc0253b59065eb5ae88a11dc635045d4e2dfca103119ea56" Oct 03 18:30:30 crc kubenswrapper[4835]: E1003 18:30:30.324624 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 03 18:30:30 crc kubenswrapper[4835]: container &Container{Name:dnsmasq-dns,Image:38.102.83.82:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bstm5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-84c5f764c9-jl5ht_openstack(9757462c-6078-422a-9786-e82455a9bcd6): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/9757462c-6078-422a-9786-e82455a9bcd6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 03 18:30:30 crc kubenswrapper[4835]: > logger="UnhandledError" Oct 03 18:30:30 crc kubenswrapper[4835]: E1003 18:30:30.325740 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/9757462c-6078-422a-9786-e82455a9bcd6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" podUID="9757462c-6078-422a-9786-e82455a9bcd6" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.360759 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kkp6t"] Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.369424 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2d5b41f3-fbdf-4663-af12-1f55f598de56-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.369497 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d5b41f3-fbdf-4663-af12-1f55f598de56-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.369525 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5b41f3-fbdf-4663-af12-1f55f598de56-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.369541 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4427\" (UniqueName: \"kubernetes.io/projected/2d5b41f3-fbdf-4663-af12-1f55f598de56-kube-api-access-f4427\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.369563 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2d5b41f3-fbdf-4663-af12-1f55f598de56-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.369584 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d5b41f3-fbdf-4663-af12-1f55f598de56-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.369611 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.369635 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d5b41f3-fbdf-4663-af12-1f55f598de56-config\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.370563 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d5b41f3-fbdf-4663-af12-1f55f598de56-config\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.370809 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2d5b41f3-fbdf-4663-af12-1f55f598de56-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.372396 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.372430 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d5b41f3-fbdf-4663-af12-1f55f598de56-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.384650 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5b41f3-fbdf-4663-af12-1f55f598de56-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.387894 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d5b41f3-fbdf-4663-af12-1f55f598de56-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.393098 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5b41f3-fbdf-4663-af12-1f55f598de56-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.405854 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4427\" (UniqueName: \"kubernetes.io/projected/2d5b41f3-fbdf-4663-af12-1f55f598de56-kube-api-access-f4427\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.428663 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2d5b41f3-fbdf-4663-af12-1f55f598de56\") " pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.507132 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.771360 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-plxn4" event={"ID":"c9280b95-ef96-4c58-948f-2abcd7ad8a25","Type":"ContainerStarted","Data":"899373b5c86937e10ced8d63218bdda737c7a33ac522363f88245b3916a720b1"} Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.775410 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"90ba54d0-8893-4168-a725-993778708104","Type":"ContainerStarted","Data":"3f148032b272a0bc16261711ccb02242c7ce42d6a7f588988cfef9d60106bbe2"} Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.777964 4835 generic.go:334] "Generic (PLEG): container finished" podID="c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2" containerID="eda8998b5826cfb7c12b97ada49ec16c14530444296e1c0338011825703c61b5" exitCode=0 Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.778131 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6759f6bdd7-59j4z" event={"ID":"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2","Type":"ContainerDied","Data":"eda8998b5826cfb7c12b97ada49ec16c14530444296e1c0338011825703c61b5"} Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.786181 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kkp6t" event={"ID":"f94fd7e6-3253-4adc-a1f9-188598d9ed3b","Type":"ContainerStarted","Data":"94cd06fb1f9129ab6b49d234bf71b1b4b1492f3cf219351039740a08039250e5"} Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.897798 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb" path="/var/lib/kubelet/pods/5e1ca2f3-b56e-40c2-9bcf-50dd4cb20cbb/volumes" Oct 03 18:30:30 crc kubenswrapper[4835]: I1003 18:30:30.898930 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b79636d7-e033-4e08-bb08-4cf8c0406522" path="/var/lib/kubelet/pods/b79636d7-e033-4e08-bb08-4cf8c0406522/volumes" Oct 03 18:30:31 crc kubenswrapper[4835]: I1003 18:30:31.742005 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 18:30:32 crc kubenswrapper[4835]: I1003 18:30:32.150396 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6759f6bdd7-59j4z" Oct 03 18:30:32 crc kubenswrapper[4835]: I1003 18:30:32.203420 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-dns-svc\") pod \"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2\" (UID: \"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2\") " Oct 03 18:30:32 crc kubenswrapper[4835]: I1003 18:30:32.203562 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87gzc\" (UniqueName: \"kubernetes.io/projected/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-kube-api-access-87gzc\") pod \"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2\" (UID: \"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2\") " Oct 03 18:30:32 crc kubenswrapper[4835]: I1003 18:30:32.203633 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-config\") pod \"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2\" (UID: \"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2\") " Oct 03 18:30:32 crc kubenswrapper[4835]: I1003 18:30:32.219688 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-kube-api-access-87gzc" (OuterVolumeSpecName: "kube-api-access-87gzc") pod "c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2" (UID: "c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2"). InnerVolumeSpecName "kube-api-access-87gzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:30:32 crc kubenswrapper[4835]: I1003 18:30:32.222276 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-config" (OuterVolumeSpecName: "config") pod "c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2" (UID: "c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:30:32 crc kubenswrapper[4835]: I1003 18:30:32.223028 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2" (UID: "c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:30:32 crc kubenswrapper[4835]: I1003 18:30:32.305292 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:32 crc kubenswrapper[4835]: I1003 18:30:32.305326 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87gzc\" (UniqueName: \"kubernetes.io/projected/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-kube-api-access-87gzc\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:32 crc kubenswrapper[4835]: I1003 18:30:32.305338 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:32 crc kubenswrapper[4835]: I1003 18:30:32.813940 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6759f6bdd7-59j4z" event={"ID":"c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2","Type":"ContainerDied","Data":"1715d742155004e973ea361571bedfca35c148290a00bc4fe989a3c4f01852ba"} Oct 03 18:30:32 crc kubenswrapper[4835]: I1003 18:30:32.813990 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6759f6bdd7-59j4z" Oct 03 18:30:32 crc kubenswrapper[4835]: I1003 18:30:32.814012 4835 scope.go:117] "RemoveContainer" containerID="eda8998b5826cfb7c12b97ada49ec16c14530444296e1c0338011825703c61b5" Oct 03 18:30:32 crc kubenswrapper[4835]: I1003 18:30:32.862104 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6759f6bdd7-59j4z"] Oct 03 18:30:32 crc kubenswrapper[4835]: I1003 18:30:32.869283 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6759f6bdd7-59j4z"] Oct 03 18:30:32 crc kubenswrapper[4835]: I1003 18:30:32.887926 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2" path="/var/lib/kubelet/pods/c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2/volumes" Oct 03 18:30:33 crc kubenswrapper[4835]: W1003 18:30:33.217231 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d5b41f3_fbdf_4663_af12_1f55f598de56.slice/crio-5d961c18b7279006119a51bb8bb0a196fb8c67f0b0ad19636429202d9e57169c WatchSource:0}: Error finding container 5d961c18b7279006119a51bb8bb0a196fb8c67f0b0ad19636429202d9e57169c: Status 404 returned error can't find the container with id 5d961c18b7279006119a51bb8bb0a196fb8c67f0b0ad19636429202d9e57169c Oct 03 18:30:33 crc kubenswrapper[4835]: I1003 18:30:33.821619 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2d5b41f3-fbdf-4663-af12-1f55f598de56","Type":"ContainerStarted","Data":"5d961c18b7279006119a51bb8bb0a196fb8c67f0b0ad19636429202d9e57169c"} Oct 03 18:30:35 crc kubenswrapper[4835]: I1003 18:30:35.944264 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6564457b49-fg4vg" Oct 03 18:30:35 crc kubenswrapper[4835]: I1003 18:30:35.993930 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84c5f764c9-jl5ht"] Oct 03 18:30:41 crc kubenswrapper[4835]: I1003 18:30:41.888759 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" podUID="9757462c-6078-422a-9786-e82455a9bcd6" containerName="dnsmasq-dns" 
containerID="cri-o://51748f179b2bef6d00f4c1690b1e9fb683d65fd11301aeb6468e468600d79a93" gracePeriod=10 Oct 03 18:30:41 crc kubenswrapper[4835]: I1003 18:30:41.889206 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" event={"ID":"9757462c-6078-422a-9786-e82455a9bcd6","Type":"ContainerStarted","Data":"51748f179b2bef6d00f4c1690b1e9fb683d65fd11301aeb6468e468600d79a93"} Oct 03 18:30:41 crc kubenswrapper[4835]: I1003 18:30:41.889374 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" Oct 03 18:30:41 crc kubenswrapper[4835]: I1003 18:30:41.891455 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2d5b41f3-fbdf-4663-af12-1f55f598de56","Type":"ContainerStarted","Data":"6d584e3310ddcb6392a1bb5ea703acb46fc0f35ec838be97212fee04ad945e5f"} Oct 03 18:30:41 crc kubenswrapper[4835]: I1003 18:30:41.894203 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d91a9a1f-a39c-4a80-8bf4-1196bacc8870","Type":"ContainerStarted","Data":"13f976e8f4c650af2a7d50e9ca49607e7aaf618c495a2f98ea3e3c4f46d565e8"} Oct 03 18:30:41 crc kubenswrapper[4835]: I1003 18:30:41.895883 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"69b75d74-7a6f-40ff-9c5c-481ced22eec0","Type":"ContainerStarted","Data":"9d83f25d71584f2d5a22f46e2209b0e0fa3d17df5594bb6f538a65c1e99439cb"} Oct 03 18:30:41 crc kubenswrapper[4835]: I1003 18:30:41.896089 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 03 18:30:41 crc kubenswrapper[4835]: I1003 18:30:41.908830 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" podStartSLOduration=26.908810994 podStartE2EDuration="26.908810994s" podCreationTimestamp="2025-10-03 18:30:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:30:41.904453647 +0000 UTC m=+983.620394519" watchObservedRunningTime="2025-10-03 18:30:41.908810994 +0000 UTC m=+983.624751866" Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.501763 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.527203 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.731824277 podStartE2EDuration="22.527182877s" podCreationTimestamp="2025-10-03 18:30:20 +0000 UTC" firstStartedPulling="2025-10-03 18:30:29.044310526 +0000 UTC m=+970.760251398" lastFinishedPulling="2025-10-03 18:30:38.839669126 +0000 UTC m=+980.555609998" observedRunningTime="2025-10-03 18:30:41.957579673 +0000 UTC m=+983.673520545" watchObservedRunningTime="2025-10-03 18:30:42.527182877 +0000 UTC m=+984.243123749" Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.602309 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9757462c-6078-422a-9786-e82455a9bcd6-config\") pod \"9757462c-6078-422a-9786-e82455a9bcd6\" (UID: \"9757462c-6078-422a-9786-e82455a9bcd6\") " Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.602442 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bstm5\" (UniqueName: \"kubernetes.io/projected/9757462c-6078-422a-9786-e82455a9bcd6-kube-api-access-bstm5\") pod \"9757462c-6078-422a-9786-e82455a9bcd6\" (UID: \"9757462c-6078-422a-9786-e82455a9bcd6\") " Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.602522 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9757462c-6078-422a-9786-e82455a9bcd6-dns-svc\") pod \"9757462c-6078-422a-9786-e82455a9bcd6\" (UID: \"9757462c-6078-422a-9786-e82455a9bcd6\") " Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.621872 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9757462c-6078-422a-9786-e82455a9bcd6-kube-api-access-bstm5" (OuterVolumeSpecName: "kube-api-access-bstm5") pod "9757462c-6078-422a-9786-e82455a9bcd6" (UID: "9757462c-6078-422a-9786-e82455a9bcd6"). InnerVolumeSpecName "kube-api-access-bstm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.703797 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bstm5\" (UniqueName: \"kubernetes.io/projected/9757462c-6078-422a-9786-e82455a9bcd6-kube-api-access-bstm5\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.903284 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gkx58" event={"ID":"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69","Type":"ContainerStarted","Data":"b9d82030ecfb6cd742b3a4a84cf66ccb34840dc1a73a88e161cc82b1a84a099c"} Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.905481 4835 generic.go:334] "Generic (PLEG): container finished" podID="9757462c-6078-422a-9786-e82455a9bcd6" containerID="51748f179b2bef6d00f4c1690b1e9fb683d65fd11301aeb6468e468600d79a93" exitCode=0 Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.905552 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.905555 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" event={"ID":"9757462c-6078-422a-9786-e82455a9bcd6","Type":"ContainerDied","Data":"51748f179b2bef6d00f4c1690b1e9fb683d65fd11301aeb6468e468600d79a93"} Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.905659 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c5f764c9-jl5ht" event={"ID":"9757462c-6078-422a-9786-e82455a9bcd6","Type":"ContainerDied","Data":"4585807acf2ee943fc0253b59065eb5ae88a11dc635045d4e2dfca103119ea56"} Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.905677 4835 scope.go:117] "RemoveContainer" containerID="51748f179b2bef6d00f4c1690b1e9fb683d65fd11301aeb6468e468600d79a93" Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.907895 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"90ba54d0-8893-4168-a725-993778708104","Type":"ContainerStarted","Data":"1878c02162c1d6ae7e3a97b9131c86c51eb208eb0e24179794df7658e3a9a39b"} Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.909851 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a53486c1-995b-46d5-84a0-f74f2ec2b5ba","Type":"ContainerStarted","Data":"13d37a7aa6826d5448b04ad3aaf523ded57f96bb2c465815ee8aa2ea884063e2"} Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.909999 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.911169 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"71cb8688-6214-4e5e-a7da-051c5939df65","Type":"ContainerStarted","Data":"702e58af99e2d7418514cc723a5ef6acb1b6115a91ee87e62a4b5727a7a37205"} Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.914137 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2d5b41f3-fbdf-4663-af12-1f55f598de56","Type":"ContainerStarted","Data":"37c2596ecb7b8736711702e7eb3799d3ccb958bca9423e4b1fb59a4baae04285"} Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.917840 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kkp6t" event={"ID":"f94fd7e6-3253-4adc-a1f9-188598d9ed3b","Type":"ContainerStarted","Data":"81571f5fc3a86b6d6018fff6bbd5355eac12ece9720c973d578e162ef9665052"} Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.927838 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-plxn4" event={"ID":"c9280b95-ef96-4c58-948f-2abcd7ad8a25","Type":"ContainerStarted","Data":"6bbdef1c686bddc9395dd9965466f719836ab0371d36f77736fb90ad447ec4dd"} Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.928522 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-plxn4" Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.935473 4835 scope.go:117] "RemoveContainer" containerID="cbd15d9b8ab364bacb57aa251d9648682c6b8638154001c1634067a887fb63c6" Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.945239 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-kkp6t" podStartSLOduration=3.991191062 podStartE2EDuration="13.945224824s" podCreationTimestamp="2025-10-03 18:30:29 +0000 UTC" 
firstStartedPulling="2025-10-03 18:30:30.623449155 +0000 UTC m=+972.339390027" lastFinishedPulling="2025-10-03 18:30:40.577482917 +0000 UTC m=+982.293423789" observedRunningTime="2025-10-03 18:30:42.942717112 +0000 UTC m=+984.658657994" watchObservedRunningTime="2025-10-03 18:30:42.945224824 +0000 UTC m=+984.661165696" Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.958904 4835 scope.go:117] "RemoveContainer" containerID="51748f179b2bef6d00f4c1690b1e9fb683d65fd11301aeb6468e468600d79a93" Oct 03 18:30:42 crc kubenswrapper[4835]: E1003 18:30:42.959440 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51748f179b2bef6d00f4c1690b1e9fb683d65fd11301aeb6468e468600d79a93\": container with ID starting with 51748f179b2bef6d00f4c1690b1e9fb683d65fd11301aeb6468e468600d79a93 not found: ID does not exist" containerID="51748f179b2bef6d00f4c1690b1e9fb683d65fd11301aeb6468e468600d79a93" Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.959479 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51748f179b2bef6d00f4c1690b1e9fb683d65fd11301aeb6468e468600d79a93"} err="failed to get container status \"51748f179b2bef6d00f4c1690b1e9fb683d65fd11301aeb6468e468600d79a93\": rpc error: code = NotFound desc = could not find container \"51748f179b2bef6d00f4c1690b1e9fb683d65fd11301aeb6468e468600d79a93\": container with ID starting with 51748f179b2bef6d00f4c1690b1e9fb683d65fd11301aeb6468e468600d79a93 not found: ID does not exist" Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.959503 4835 scope.go:117] "RemoveContainer" containerID="cbd15d9b8ab364bacb57aa251d9648682c6b8638154001c1634067a887fb63c6" Oct 03 18:30:42 crc kubenswrapper[4835]: E1003 18:30:42.959737 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbd15d9b8ab364bacb57aa251d9648682c6b8638154001c1634067a887fb63c6\": container with ID starting with cbd15d9b8ab364bacb57aa251d9648682c6b8638154001c1634067a887fb63c6 not found: ID does not exist" containerID="cbd15d9b8ab364bacb57aa251d9648682c6b8638154001c1634067a887fb63c6" Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.959761 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbd15d9b8ab364bacb57aa251d9648682c6b8638154001c1634067a887fb63c6"} err="failed to get container status \"cbd15d9b8ab364bacb57aa251d9648682c6b8638154001c1634067a887fb63c6\": rpc error: code = NotFound desc = could not find container \"cbd15d9b8ab364bacb57aa251d9648682c6b8638154001c1634067a887fb63c6\": container with ID starting with cbd15d9b8ab364bacb57aa251d9648682c6b8638154001c1634067a887fb63c6 not found: ID does not exist" Oct 03 18:30:42 crc kubenswrapper[4835]: I1003 18:30:42.978883 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=7.234527747 podStartE2EDuration="13.978864981s" podCreationTimestamp="2025-10-03 18:30:29 +0000 UTC" firstStartedPulling="2025-10-03 18:30:33.246284346 +0000 UTC m=+974.962225218" lastFinishedPulling="2025-10-03 18:30:39.99062159 +0000 UTC m=+981.706562452" observedRunningTime="2025-10-03 18:30:42.971726215 +0000 UTC m=+984.687667087" watchObservedRunningTime="2025-10-03 18:30:42.978864981 +0000 UTC m=+984.694805853" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.028029 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/kube-state-metrics-0" podStartSLOduration=10.050532163 podStartE2EDuration="21.028010339s" podCreationTimestamp="2025-10-03 18:30:22 +0000 UTC" firstStartedPulling="2025-10-03 18:30:29.694740459 +0000 UTC m=+971.410681331" lastFinishedPulling="2025-10-03 18:30:40.672218625 +0000 UTC m=+982.388159507" observedRunningTime="2025-10-03 18:30:43.017049019 +0000 UTC m=+984.732989891" watchObservedRunningTime="2025-10-03 18:30:43.028010339 +0000 UTC m=+984.743951211" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.061122 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-plxn4" podStartSLOduration=7.450298428 podStartE2EDuration="18.061099451s" podCreationTimestamp="2025-10-03 18:30:25 +0000 UTC" firstStartedPulling="2025-10-03 18:30:29.903976212 +0000 UTC m=+971.619917084" lastFinishedPulling="2025-10-03 18:30:40.514777245 +0000 UTC m=+982.230718107" observedRunningTime="2025-10-03 18:30:43.048624105 +0000 UTC m=+984.764564997" watchObservedRunningTime="2025-10-03 18:30:43.061099451 +0000 UTC m=+984.777040343" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.176883 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9757462c-6078-422a-9786-e82455a9bcd6-config" (OuterVolumeSpecName: "config") pod "9757462c-6078-422a-9786-e82455a9bcd6" (UID: "9757462c-6078-422a-9786-e82455a9bcd6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.211713 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9757462c-6078-422a-9786-e82455a9bcd6-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.223789 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7d48d677-h7jw9"] Oct 03 18:30:43 crc kubenswrapper[4835]: E1003 18:30:43.224434 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9757462c-6078-422a-9786-e82455a9bcd6" containerName="init" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.224556 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9757462c-6078-422a-9786-e82455a9bcd6" containerName="init" Oct 03 18:30:43 crc kubenswrapper[4835]: E1003 18:30:43.224652 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2" containerName="init" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.224725 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2" containerName="init" Oct 03 18:30:43 crc kubenswrapper[4835]: E1003 18:30:43.224818 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9757462c-6078-422a-9786-e82455a9bcd6" containerName="dnsmasq-dns" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.224887 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9757462c-6078-422a-9786-e82455a9bcd6" containerName="dnsmasq-dns" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.225209 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9757462c-6078-422a-9786-e82455a9bcd6" containerName="dnsmasq-dns" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.225277 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a54a7e-99ca-476f-a4bd-bec7c33cc7b2" containerName="init" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.231474 4835 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.233600 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7d48d677-h7jw9"] Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.233908 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.314154 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s8fq\" (UniqueName: \"kubernetes.io/projected/08ce9add-21a6-4c22-acce-436481ff3eda-kube-api-access-9s8fq\") pod \"dnsmasq-dns-5b7d48d677-h7jw9\" (UID: \"08ce9add-21a6-4c22-acce-436481ff3eda\") " pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.314277 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-ovsdbserver-nb\") pod \"dnsmasq-dns-5b7d48d677-h7jw9\" (UID: \"08ce9add-21a6-4c22-acce-436481ff3eda\") " pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.314370 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-config\") pod \"dnsmasq-dns-5b7d48d677-h7jw9\" (UID: \"08ce9add-21a6-4c22-acce-436481ff3eda\") " pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.314616 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-dns-svc\") pod \"dnsmasq-dns-5b7d48d677-h7jw9\" (UID: \"08ce9add-21a6-4c22-acce-436481ff3eda\") " pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.401251 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7d48d677-h7jw9"] Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.409787 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55b8767675-ph6rj"] Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.411550 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.416737 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-ovsdbserver-nb\") pod \"dnsmasq-dns-5b7d48d677-h7jw9\" (UID: \"08ce9add-21a6-4c22-acce-436481ff3eda\") " pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.416819 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-config\") pod \"dnsmasq-dns-5b7d48d677-h7jw9\" (UID: \"08ce9add-21a6-4c22-acce-436481ff3eda\") " pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.416860 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-dns-svc\") pod \"dnsmasq-dns-5b7d48d677-h7jw9\" (UID: \"08ce9add-21a6-4c22-acce-436481ff3eda\") " pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.416898 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s8fq\" (UniqueName: \"kubernetes.io/projected/08ce9add-21a6-4c22-acce-436481ff3eda-kube-api-access-9s8fq\") pod \"dnsmasq-dns-5b7d48d677-h7jw9\" (UID: \"08ce9add-21a6-4c22-acce-436481ff3eda\") " pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.419368 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.420096 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-dns-svc\") pod \"dnsmasq-dns-5b7d48d677-h7jw9\" (UID: \"08ce9add-21a6-4c22-acce-436481ff3eda\") " pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.420286 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-ovsdbserver-nb\") pod \"dnsmasq-dns-5b7d48d677-h7jw9\" (UID: \"08ce9add-21a6-4c22-acce-436481ff3eda\") " pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.429606 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9757462c-6078-422a-9786-e82455a9bcd6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9757462c-6078-422a-9786-e82455a9bcd6" (UID: "9757462c-6078-422a-9786-e82455a9bcd6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.430258 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-config\") pod \"dnsmasq-dns-5b7d48d677-h7jw9\" (UID: \"08ce9add-21a6-4c22-acce-436481ff3eda\") " pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.440833 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55b8767675-ph6rj"] Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.459896 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s8fq\" (UniqueName: \"kubernetes.io/projected/08ce9add-21a6-4c22-acce-436481ff3eda-kube-api-access-9s8fq\") pod \"dnsmasq-dns-5b7d48d677-h7jw9\" (UID: \"08ce9add-21a6-4c22-acce-436481ff3eda\") " pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.500838 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.518562 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-ovsdbserver-sb\") pod \"dnsmasq-dns-55b8767675-ph6rj\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.518613 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-ovsdbserver-nb\") pod \"dnsmasq-dns-55b8767675-ph6rj\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.518662 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-config\") pod \"dnsmasq-dns-55b8767675-ph6rj\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.518685 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnnl7\" (UniqueName: \"kubernetes.io/projected/783d16b4-0ea1-451a-bedd-2f41f406b1ea-kube-api-access-xnnl7\") pod \"dnsmasq-dns-55b8767675-ph6rj\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.518752 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-dns-svc\") pod \"dnsmasq-dns-55b8767675-ph6rj\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.518795 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9757462c-6078-422a-9786-e82455a9bcd6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.565623 4835 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-84c5f764c9-jl5ht"] Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.572493 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84c5f764c9-jl5ht"] Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.620629 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-dns-svc\") pod \"dnsmasq-dns-55b8767675-ph6rj\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.620690 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-ovsdbserver-sb\") pod \"dnsmasq-dns-55b8767675-ph6rj\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.620739 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-ovsdbserver-nb\") pod \"dnsmasq-dns-55b8767675-ph6rj\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.620804 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-config\") pod \"dnsmasq-dns-55b8767675-ph6rj\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.620827 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnnl7\" (UniqueName: \"kubernetes.io/projected/783d16b4-0ea1-451a-bedd-2f41f406b1ea-kube-api-access-xnnl7\") pod \"dnsmasq-dns-55b8767675-ph6rj\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.622130 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-ovsdbserver-nb\") pod \"dnsmasq-dns-55b8767675-ph6rj\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.622135 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-ovsdbserver-sb\") pod \"dnsmasq-dns-55b8767675-ph6rj\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.622200 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-config\") pod \"dnsmasq-dns-55b8767675-ph6rj\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.622727 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-dns-svc\") pod \"dnsmasq-dns-55b8767675-ph6rj\" 
(UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.637684 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnnl7\" (UniqueName: \"kubernetes.io/projected/783d16b4-0ea1-451a-bedd-2f41f406b1ea-kube-api-access-xnnl7\") pod \"dnsmasq-dns-55b8767675-ph6rj\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.808596 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.945028 4835 generic.go:334] "Generic (PLEG): container finished" podID="6ad2fc66-9ffc-4229-a5d0-63c8239c8c69" containerID="b9d82030ecfb6cd742b3a4a84cf66ccb34840dc1a73a88e161cc82b1a84a099c" exitCode=0 Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.945113 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gkx58" event={"ID":"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69","Type":"ContainerDied","Data":"b9d82030ecfb6cd742b3a4a84cf66ccb34840dc1a73a88e161cc82b1a84a099c"} Oct 03 18:30:43 crc kubenswrapper[4835]: I1003 18:30:43.990086 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7d48d677-h7jw9"] Oct 03 18:30:43 crc kubenswrapper[4835]: W1003 18:30:43.992961 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08ce9add_21a6_4c22_acce_436481ff3eda.slice/crio-7bc0ef61f7e2d42200a0706f0edf58ddb09c2e9f8db3dcc141bfc12cc424c333 WatchSource:0}: Error finding container 7bc0ef61f7e2d42200a0706f0edf58ddb09c2e9f8db3dcc141bfc12cc424c333: Status 404 returned error can't find the container with id 7bc0ef61f7e2d42200a0706f0edf58ddb09c2e9f8db3dcc141bfc12cc424c333 Oct 03 18:30:44 crc kubenswrapper[4835]: I1003 18:30:44.327634 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55b8767675-ph6rj"] Oct 03 18:30:44 crc kubenswrapper[4835]: I1003 18:30:44.886136 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9757462c-6078-422a-9786-e82455a9bcd6" path="/var/lib/kubelet/pods/9757462c-6078-422a-9786-e82455a9bcd6/volumes" Oct 03 18:30:44 crc kubenswrapper[4835]: I1003 18:30:44.956905 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b8767675-ph6rj" event={"ID":"783d16b4-0ea1-451a-bedd-2f41f406b1ea","Type":"ContainerStarted","Data":"638943ac3617fb8294c26c7dd9a4dc4aa0a71dfe3ac234118ab57d2c0a068526"} Oct 03 18:30:44 crc kubenswrapper[4835]: I1003 18:30:44.958170 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" event={"ID":"08ce9add-21a6-4c22-acce-436481ff3eda","Type":"ContainerStarted","Data":"7bc0ef61f7e2d42200a0706f0edf58ddb09c2e9f8db3dcc141bfc12cc424c333"} Oct 03 18:30:45 crc kubenswrapper[4835]: I1003 18:30:45.508195 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:45 crc kubenswrapper[4835]: I1003 18:30:45.508246 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:45 crc kubenswrapper[4835]: I1003 18:30:45.550236 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:46 crc kubenswrapper[4835]: I1003 
18:30:46.972287 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"90ba54d0-8893-4168-a725-993778708104","Type":"ContainerStarted","Data":"b29051c1f4d40040517a5a5ae98ab59bf3939fb08b4c88af57a99ab616572383"} Oct 03 18:30:49 crc kubenswrapper[4835]: I1003 18:30:49.995274 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2f5f99aa-dba6-465b-866a-1e293ba51685","Type":"ContainerStarted","Data":"6073b92c5fb7502902c8d7c4767ef2d79902b9a0b6bebdbfdacd3f52f2b31641"} Oct 03 18:30:49 crc kubenswrapper[4835]: I1003 18:30:49.996948 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1c6c5b2c-c368-4237-83ec-18ae3d06fe61","Type":"ContainerStarted","Data":"7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175"} Oct 03 18:30:49 crc kubenswrapper[4835]: I1003 18:30:49.999603 4835 generic.go:334] "Generic (PLEG): container finished" podID="08ce9add-21a6-4c22-acce-436481ff3eda" containerID="7920baadfc76c65d1fadb5136d8b6690f0066f002599a96254be1949858d515e" exitCode=0 Oct 03 18:30:49 crc kubenswrapper[4835]: I1003 18:30:49.999683 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" event={"ID":"08ce9add-21a6-4c22-acce-436481ff3eda","Type":"ContainerDied","Data":"7920baadfc76c65d1fadb5136d8b6690f0066f002599a96254be1949858d515e"} Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.002130 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6fd26bdb-868b-49db-9698-e7c79eea5cef","Type":"ContainerStarted","Data":"82cdd552bd8994820289189a6ad58b7da0d0849b9d8048609dea0a6514c0530d"} Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.006252 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gkx58" event={"ID":"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69","Type":"ContainerStarted","Data":"bc8eec80ea60b182cee3b898d4cd9a830a83b1c932c145a28aae709f4ef9a102"} Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.006307 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gkx58" event={"ID":"6ad2fc66-9ffc-4229-a5d0-63c8239c8c69","Type":"ContainerStarted","Data":"a9880109db9e90326023835089894038474b3683afbda7f68157c149d3128f3e"} Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.006445 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.008592 4835 generic.go:334] "Generic (PLEG): container finished" podID="783d16b4-0ea1-451a-bedd-2f41f406b1ea" containerID="921fc3736df3fd5d87a8fd6493529ec2eb7ee9283cd668994ce37713a72722d3" exitCode=0 Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.008676 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b8767675-ph6rj" event={"ID":"783d16b4-0ea1-451a-bedd-2f41f406b1ea","Type":"ContainerDied","Data":"921fc3736df3fd5d87a8fd6493529ec2eb7ee9283cd668994ce37713a72722d3"} Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.010863 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"b17ce629-9abd-42ba-8004-cc4b85cee405","Type":"ContainerStarted","Data":"5a870ffee6244949225426c57f6551b5424c7b1591920c9a5feadf9e78b8050c"} Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.076283 4835 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ovn-controller-ovs-gkx58" podStartSLOduration=14.560579604 podStartE2EDuration="25.076262754s" podCreationTimestamp="2025-10-03 18:30:25 +0000 UTC" firstStartedPulling="2025-10-03 18:30:29.29757996 +0000 UTC m=+971.013520822" lastFinishedPulling="2025-10-03 18:30:39.8132631 +0000 UTC m=+981.529203972" observedRunningTime="2025-10-03 18:30:50.069578559 +0000 UTC m=+991.785519441" watchObservedRunningTime="2025-10-03 18:30:50.076262754 +0000 UTC m=+991.792203636" Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.197327 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=14.938349391 podStartE2EDuration="25.197273288s" podCreationTimestamp="2025-10-03 18:30:25 +0000 UTC" firstStartedPulling="2025-10-03 18:30:29.984985629 +0000 UTC m=+971.700926501" lastFinishedPulling="2025-10-03 18:30:40.243909526 +0000 UTC m=+981.959850398" observedRunningTime="2025-10-03 18:30:50.184787711 +0000 UTC m=+991.900728603" watchObservedRunningTime="2025-10-03 18:30:50.197273288 +0000 UTC m=+991.913214160" Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.427514 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.443100 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-config\") pod \"08ce9add-21a6-4c22-acce-436481ff3eda\" (UID: \"08ce9add-21a6-4c22-acce-436481ff3eda\") " Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.443149 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s8fq\" (UniqueName: \"kubernetes.io/projected/08ce9add-21a6-4c22-acce-436481ff3eda-kube-api-access-9s8fq\") pod \"08ce9add-21a6-4c22-acce-436481ff3eda\" (UID: \"08ce9add-21a6-4c22-acce-436481ff3eda\") " Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.443197 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-dns-svc\") pod \"08ce9add-21a6-4c22-acce-436481ff3eda\" (UID: \"08ce9add-21a6-4c22-acce-436481ff3eda\") " Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.443275 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-ovsdbserver-nb\") pod \"08ce9add-21a6-4c22-acce-436481ff3eda\" (UID: \"08ce9add-21a6-4c22-acce-436481ff3eda\") " Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.455773 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08ce9add-21a6-4c22-acce-436481ff3eda-kube-api-access-9s8fq" (OuterVolumeSpecName: "kube-api-access-9s8fq") pod "08ce9add-21a6-4c22-acce-436481ff3eda" (UID: "08ce9add-21a6-4c22-acce-436481ff3eda"). InnerVolumeSpecName "kube-api-access-9s8fq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.463963 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-config" (OuterVolumeSpecName: "config") pod "08ce9add-21a6-4c22-acce-436481ff3eda" (UID: "08ce9add-21a6-4c22-acce-436481ff3eda"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.464860 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "08ce9add-21a6-4c22-acce-436481ff3eda" (UID: "08ce9add-21a6-4c22-acce-436481ff3eda"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.470629 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "08ce9add-21a6-4c22-acce-436481ff3eda" (UID: "08ce9add-21a6-4c22-acce-436481ff3eda"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.543496 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.544567 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.544599 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s8fq\" (UniqueName: \"kubernetes.io/projected/08ce9add-21a6-4c22-acce-436481ff3eda-kube-api-access-9s8fq\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.544622 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.544633 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08ce9add-21a6-4c22-acce-436481ff3eda-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.754125 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.859235 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.898798 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:30:50 crc kubenswrapper[4835]: I1003 18:30:50.900992 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:51 crc kubenswrapper[4835]: E1003 18:30:51.002907 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08ce9add_21a6_4c22_acce_436481ff3eda.slice\": RecentStats: unable to find data in memory cache]" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.027348 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b8767675-ph6rj" event={"ID":"783d16b4-0ea1-451a-bedd-2f41f406b1ea","Type":"ContainerStarted","Data":"c9ceb6e69acb8d84e2948964b9d882a317964cf44d417a35371b87c406fa5539"} Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.027464 4835 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.030094 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" event={"ID":"08ce9add-21a6-4c22-acce-436481ff3eda","Type":"ContainerDied","Data":"7bc0ef61f7e2d42200a0706f0edf58ddb09c2e9f8db3dcc141bfc12cc424c333"} Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.030167 4835 scope.go:117] "RemoveContainer" containerID="7920baadfc76c65d1fadb5136d8b6690f0066f002599a96254be1949858d515e" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.030214 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7d48d677-h7jw9" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.031733 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.049377 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55b8767675-ph6rj" podStartSLOduration=8.049364105 podStartE2EDuration="8.049364105s" podCreationTimestamp="2025-10-03 18:30:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:30:51.049043437 +0000 UTC m=+992.764984319" watchObservedRunningTime="2025-10-03 18:30:51.049364105 +0000 UTC m=+992.765304977" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.097126 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7d48d677-h7jw9"] Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.097915 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.099364 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7d48d677-h7jw9"] Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.281539 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 03 18:30:51 crc kubenswrapper[4835]: E1003 18:30:51.282012 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ce9add-21a6-4c22-acce-436481ff3eda" containerName="init" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.282028 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ce9add-21a6-4c22-acce-436481ff3eda" containerName="init" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.282228 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ce9add-21a6-4c22-acce-436481ff3eda" containerName="init" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.283006 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.286934 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.287405 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.287425 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.287600 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-gsftq" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.309286 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.356412 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pmgz\" (UniqueName: \"kubernetes.io/projected/d54a09d0-c069-432c-96e9-e742f143e2a9-kube-api-access-6pmgz\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.356562 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d54a09d0-c069-432c-96e9-e742f143e2a9-scripts\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.356607 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54a09d0-c069-432c-96e9-e742f143e2a9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.356663 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d54a09d0-c069-432c-96e9-e742f143e2a9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.356754 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54a09d0-c069-432c-96e9-e742f143e2a9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.356847 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54a09d0-c069-432c-96e9-e742f143e2a9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.356894 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54a09d0-c069-432c-96e9-e742f143e2a9-config\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: 
I1003 18:30:51.458890 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54a09d0-c069-432c-96e9-e742f143e2a9-config\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.459283 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pmgz\" (UniqueName: \"kubernetes.io/projected/d54a09d0-c069-432c-96e9-e742f143e2a9-kube-api-access-6pmgz\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.459395 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d54a09d0-c069-432c-96e9-e742f143e2a9-scripts\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.459426 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54a09d0-c069-432c-96e9-e742f143e2a9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.459481 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d54a09d0-c069-432c-96e9-e742f143e2a9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.459509 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54a09d0-c069-432c-96e9-e742f143e2a9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.459549 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54a09d0-c069-432c-96e9-e742f143e2a9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.460943 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d54a09d0-c069-432c-96e9-e742f143e2a9-scripts\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.461431 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d54a09d0-c069-432c-96e9-e742f143e2a9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.461522 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54a09d0-c069-432c-96e9-e742f143e2a9-config\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.467763 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d54a09d0-c069-432c-96e9-e742f143e2a9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.469119 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54a09d0-c069-432c-96e9-e742f143e2a9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.472055 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d54a09d0-c069-432c-96e9-e742f143e2a9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.477165 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pmgz\" (UniqueName: \"kubernetes.io/projected/d54a09d0-c069-432c-96e9-e742f143e2a9-kube-api-access-6pmgz\") pod \"ovn-northd-0\" (UID: \"d54a09d0-c069-432c-96e9-e742f143e2a9\") " pod="openstack/ovn-northd-0" Oct 03 18:30:51 crc kubenswrapper[4835]: I1003 18:30:51.602008 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.055103 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.374209 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.418566 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55b8767675-ph6rj"] Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.430957 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4d484c5-nhr9s"] Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.432241 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.491734 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxfv9\" (UniqueName: \"kubernetes.io/projected/dbd81335-256c-4b39-bb05-096ba524b652-kube-api-access-jxfv9\") pod \"dnsmasq-dns-6bb4d484c5-nhr9s\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.491820 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-dns-svc\") pod \"dnsmasq-dns-6bb4d484c5-nhr9s\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.491855 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4d484c5-nhr9s\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.491887 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-config\") pod \"dnsmasq-dns-6bb4d484c5-nhr9s\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.491918 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4d484c5-nhr9s\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.499014 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4d484c5-nhr9s"] Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.593847 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4d484c5-nhr9s\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.593910 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-config\") pod \"dnsmasq-dns-6bb4d484c5-nhr9s\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.594124 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4d484c5-nhr9s\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.594287 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jxfv9\" (UniqueName: \"kubernetes.io/projected/dbd81335-256c-4b39-bb05-096ba524b652-kube-api-access-jxfv9\") pod \"dnsmasq-dns-6bb4d484c5-nhr9s\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.594437 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-dns-svc\") pod \"dnsmasq-dns-6bb4d484c5-nhr9s\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.594794 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-config\") pod \"dnsmasq-dns-6bb4d484c5-nhr9s\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.594887 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4d484c5-nhr9s\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.595139 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4d484c5-nhr9s\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.595632 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-dns-svc\") pod \"dnsmasq-dns-6bb4d484c5-nhr9s\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.613112 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxfv9\" (UniqueName: \"kubernetes.io/projected/dbd81335-256c-4b39-bb05-096ba524b652-kube-api-access-jxfv9\") pod \"dnsmasq-dns-6bb4d484c5-nhr9s\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.759928 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:52 crc kubenswrapper[4835]: I1003 18:30:52.920231 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08ce9add-21a6-4c22-acce-436481ff3eda" path="/var/lib/kubelet/pods/08ce9add-21a6-4c22-acce-436481ff3eda/volumes" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.050367 4835 generic.go:334] "Generic (PLEG): container finished" podID="d91a9a1f-a39c-4a80-8bf4-1196bacc8870" containerID="13f976e8f4c650af2a7d50e9ca49607e7aaf618c495a2f98ea3e3c4f46d565e8" exitCode=0 Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.050430 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d91a9a1f-a39c-4a80-8bf4-1196bacc8870","Type":"ContainerDied","Data":"13f976e8f4c650af2a7d50e9ca49607e7aaf618c495a2f98ea3e3c4f46d565e8"} Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.052760 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d54a09d0-c069-432c-96e9-e742f143e2a9","Type":"ContainerStarted","Data":"76bb3fafcc1483930c9f7fb1e878865cc9e6ab244dd85aab1a430700c52b85a6"} Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.052971 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55b8767675-ph6rj" podUID="783d16b4-0ea1-451a-bedd-2f41f406b1ea" containerName="dnsmasq-dns" containerID="cri-o://c9ceb6e69acb8d84e2948964b9d882a317964cf44d417a35371b87c406fa5539" gracePeriod=10 Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.281627 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4d484c5-nhr9s"] Oct 03 18:30:53 crc kubenswrapper[4835]: W1003 18:30:53.296639 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbd81335_256c_4b39_bb05_096ba524b652.slice/crio-03196989d955ed020a12a52b2170aa2ad5a65737f0fcfd6ba059006f5bda2e31 WatchSource:0}: Error finding container 03196989d955ed020a12a52b2170aa2ad5a65737f0fcfd6ba059006f5bda2e31: Status 404 returned error can't find the container with id 03196989d955ed020a12a52b2170aa2ad5a65737f0fcfd6ba059006f5bda2e31 Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.482488 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.532796 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 03 18:30:53 crc kubenswrapper[4835]: E1003 18:30:53.533483 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783d16b4-0ea1-451a-bedd-2f41f406b1ea" containerName="dnsmasq-dns" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.533502 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="783d16b4-0ea1-451a-bedd-2f41f406b1ea" containerName="dnsmasq-dns" Oct 03 18:30:53 crc kubenswrapper[4835]: E1003 18:30:53.533515 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783d16b4-0ea1-451a-bedd-2f41f406b1ea" containerName="init" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.533522 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="783d16b4-0ea1-451a-bedd-2f41f406b1ea" containerName="init" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.533898 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="783d16b4-0ea1-451a-bedd-2f41f406b1ea" containerName="dnsmasq-dns" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.542501 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.545139 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-pjx5h" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.550154 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.550472 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.551273 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.553136 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.614294 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnnl7\" (UniqueName: \"kubernetes.io/projected/783d16b4-0ea1-451a-bedd-2f41f406b1ea-kube-api-access-xnnl7\") pod \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.614391 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-config\") pod \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.614443 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-dns-svc\") pod \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.614528 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-ovsdbserver-nb\") pod \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\" (UID: 
\"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.614584 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-ovsdbserver-sb\") pod \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\" (UID: \"783d16b4-0ea1-451a-bedd-2f41f406b1ea\") " Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.615433 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkcql\" (UniqueName: \"kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-kube-api-access-pkcql\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.615490 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.615559 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c8359224-c77a-4d86-878b-6f073225ed33-cache\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.615629 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.615703 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c8359224-c77a-4d86-878b-6f073225ed33-lock\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.636767 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783d16b4-0ea1-451a-bedd-2f41f406b1ea-kube-api-access-xnnl7" (OuterVolumeSpecName: "kube-api-access-xnnl7") pod "783d16b4-0ea1-451a-bedd-2f41f406b1ea" (UID: "783d16b4-0ea1-451a-bedd-2f41f406b1ea"). InnerVolumeSpecName "kube-api-access-xnnl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.656392 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-config" (OuterVolumeSpecName: "config") pod "783d16b4-0ea1-451a-bedd-2f41f406b1ea" (UID: "783d16b4-0ea1-451a-bedd-2f41f406b1ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.658254 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "783d16b4-0ea1-451a-bedd-2f41f406b1ea" (UID: "783d16b4-0ea1-451a-bedd-2f41f406b1ea"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.677217 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "783d16b4-0ea1-451a-bedd-2f41f406b1ea" (UID: "783d16b4-0ea1-451a-bedd-2f41f406b1ea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.683591 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "783d16b4-0ea1-451a-bedd-2f41f406b1ea" (UID: "783d16b4-0ea1-451a-bedd-2f41f406b1ea"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.716662 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c8359224-c77a-4d86-878b-6f073225ed33-lock\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.716722 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkcql\" (UniqueName: \"kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-kube-api-access-pkcql\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.716760 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.716801 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c8359224-c77a-4d86-878b-6f073225ed33-cache\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.716858 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.716909 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnnl7\" (UniqueName: \"kubernetes.io/projected/783d16b4-0ea1-451a-bedd-2f41f406b1ea-kube-api-access-xnnl7\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.716922 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.716932 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.716941 4835 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.716949 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/783d16b4-0ea1-451a-bedd-2f41f406b1ea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 18:30:53 crc kubenswrapper[4835]: E1003 18:30:53.717060 4835 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 18:30:53 crc kubenswrapper[4835]: E1003 18:30:53.717092 4835 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 18:30:53 crc kubenswrapper[4835]: E1003 18:30:53.717139 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift podName:c8359224-c77a-4d86-878b-6f073225ed33 nodeName:}" failed. No retries permitted until 2025-10-03 18:30:54.217120875 +0000 UTC m=+995.933061747 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift") pod "swift-storage-0" (UID: "c8359224-c77a-4d86-878b-6f073225ed33") : configmap "swift-ring-files" not found Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.717214 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c8359224-c77a-4d86-878b-6f073225ed33-lock\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.717450 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.717572 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c8359224-c77a-4d86-878b-6f073225ed33-cache\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.735505 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkcql\" (UniqueName: \"kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-kube-api-access-pkcql\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.738540 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.839723 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mflzl"] Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.865575 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.870456 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.881794 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.887251 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.890779 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mflzl"] Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.920497 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-dispersionconf\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.920611 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsdgw\" (UniqueName: \"kubernetes.io/projected/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-kube-api-access-hsdgw\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.920650 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-ring-data-devices\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.920692 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-scripts\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.920717 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-combined-ca-bundle\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.920766 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-swiftconf\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:53 crc kubenswrapper[4835]: I1003 18:30:53.920793 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-etc-swift\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 
18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.023899 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-etc-swift\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.023967 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-dispersionconf\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.024028 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsdgw\" (UniqueName: \"kubernetes.io/projected/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-kube-api-access-hsdgw\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.024059 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-ring-data-devices\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.024116 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-scripts\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.024139 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-combined-ca-bundle\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.024173 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-swiftconf\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.026010 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-etc-swift\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.026662 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-ring-data-devices\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.026841 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-scripts\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.034638 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-swiftconf\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.036636 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-dispersionconf\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.038599 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-combined-ca-bundle\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.049645 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsdgw\" (UniqueName: \"kubernetes.io/projected/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-kube-api-access-hsdgw\") pod \"swift-ring-rebalance-mflzl\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.060930 4835 generic.go:334] "Generic (PLEG): container finished" podID="783d16b4-0ea1-451a-bedd-2f41f406b1ea" containerID="c9ceb6e69acb8d84e2948964b9d882a317964cf44d417a35371b87c406fa5539" exitCode=0 Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.060994 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b8767675-ph6rj" event={"ID":"783d16b4-0ea1-451a-bedd-2f41f406b1ea","Type":"ContainerDied","Data":"c9ceb6e69acb8d84e2948964b9d882a317964cf44d417a35371b87c406fa5539"} Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.061019 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b8767675-ph6rj" event={"ID":"783d16b4-0ea1-451a-bedd-2f41f406b1ea","Type":"ContainerDied","Data":"638943ac3617fb8294c26c7dd9a4dc4aa0a71dfe3ac234118ab57d2c0a068526"} Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.061018 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55b8767675-ph6rj" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.061036 4835 scope.go:117] "RemoveContainer" containerID="c9ceb6e69acb8d84e2948964b9d882a317964cf44d417a35371b87c406fa5539" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.062562 4835 generic.go:334] "Generic (PLEG): container finished" podID="71cb8688-6214-4e5e-a7da-051c5939df65" containerID="702e58af99e2d7418514cc723a5ef6acb1b6115a91ee87e62a4b5727a7a37205" exitCode=0 Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.062611 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"71cb8688-6214-4e5e-a7da-051c5939df65","Type":"ContainerDied","Data":"702e58af99e2d7418514cc723a5ef6acb1b6115a91ee87e62a4b5727a7a37205"} Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.064703 4835 generic.go:334] "Generic (PLEG): container finished" podID="dbd81335-256c-4b39-bb05-096ba524b652" containerID="02bdb55047f68f3bbab1cd69d2927a82fccb8f5e5d281a634332057c9fcdcd00" exitCode=0 Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.064793 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" event={"ID":"dbd81335-256c-4b39-bb05-096ba524b652","Type":"ContainerDied","Data":"02bdb55047f68f3bbab1cd69d2927a82fccb8f5e5d281a634332057c9fcdcd00"} Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.064815 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" event={"ID":"dbd81335-256c-4b39-bb05-096ba524b652","Type":"ContainerStarted","Data":"03196989d955ed020a12a52b2170aa2ad5a65737f0fcfd6ba059006f5bda2e31"} Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.068749 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d91a9a1f-a39c-4a80-8bf4-1196bacc8870","Type":"ContainerStarted","Data":"e75c0e8c9381150dd9f0da589ae3ebb673efcc1a028a2878657b799ba49b9362"} Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.070460 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d54a09d0-c069-432c-96e9-e742f143e2a9","Type":"ContainerStarted","Data":"bbc4a018057953e88a5edff35a4343cb85839927ff2dc15342bb19926c4e11b6"} Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.070512 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d54a09d0-c069-432c-96e9-e742f143e2a9","Type":"ContainerStarted","Data":"c3a99a34c0ce10d6de470f289f5704ec971248c2e68d71a7a3255009bbd8f519"} Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.070719 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.078451 4835 scope.go:117] "RemoveContainer" containerID="921fc3736df3fd5d87a8fd6493529ec2eb7ee9283cd668994ce37713a72722d3" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.111474 4835 scope.go:117] "RemoveContainer" containerID="c9ceb6e69acb8d84e2948964b9d882a317964cf44d417a35371b87c406fa5539" Oct 03 18:30:54 crc kubenswrapper[4835]: E1003 18:30:54.114965 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9ceb6e69acb8d84e2948964b9d882a317964cf44d417a35371b87c406fa5539\": container with ID starting with c9ceb6e69acb8d84e2948964b9d882a317964cf44d417a35371b87c406fa5539 not found: ID does not exist" 
containerID="c9ceb6e69acb8d84e2948964b9d882a317964cf44d417a35371b87c406fa5539" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.115002 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9ceb6e69acb8d84e2948964b9d882a317964cf44d417a35371b87c406fa5539"} err="failed to get container status \"c9ceb6e69acb8d84e2948964b9d882a317964cf44d417a35371b87c406fa5539\": rpc error: code = NotFound desc = could not find container \"c9ceb6e69acb8d84e2948964b9d882a317964cf44d417a35371b87c406fa5539\": container with ID starting with c9ceb6e69acb8d84e2948964b9d882a317964cf44d417a35371b87c406fa5539 not found: ID does not exist" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.115027 4835 scope.go:117] "RemoveContainer" containerID="921fc3736df3fd5d87a8fd6493529ec2eb7ee9283cd668994ce37713a72722d3" Oct 03 18:30:54 crc kubenswrapper[4835]: E1003 18:30:54.116748 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"921fc3736df3fd5d87a8fd6493529ec2eb7ee9283cd668994ce37713a72722d3\": container with ID starting with 921fc3736df3fd5d87a8fd6493529ec2eb7ee9283cd668994ce37713a72722d3 not found: ID does not exist" containerID="921fc3736df3fd5d87a8fd6493529ec2eb7ee9283cd668994ce37713a72722d3" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.116792 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"921fc3736df3fd5d87a8fd6493529ec2eb7ee9283cd668994ce37713a72722d3"} err="failed to get container status \"921fc3736df3fd5d87a8fd6493529ec2eb7ee9283cd668994ce37713a72722d3\": rpc error: code = NotFound desc = could not find container \"921fc3736df3fd5d87a8fd6493529ec2eb7ee9283cd668994ce37713a72722d3\": container with ID starting with 921fc3736df3fd5d87a8fd6493529ec2eb7ee9283cd668994ce37713a72722d3 not found: ID does not exist" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.139535 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.774530218 podStartE2EDuration="35.139518779s" podCreationTimestamp="2025-10-03 18:30:19 +0000 UTC" firstStartedPulling="2025-10-03 18:30:29.002241424 +0000 UTC m=+970.718182296" lastFinishedPulling="2025-10-03 18:30:39.367229985 +0000 UTC m=+981.083170857" observedRunningTime="2025-10-03 18:30:54.133252675 +0000 UTC m=+995.849193547" watchObservedRunningTime="2025-10-03 18:30:54.139518779 +0000 UTC m=+995.855459641" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.152143 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55b8767675-ph6rj"] Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.163105 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55b8767675-ph6rj"] Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.163585 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.389789328 podStartE2EDuration="3.16356252s" podCreationTimestamp="2025-10-03 18:30:51 +0000 UTC" firstStartedPulling="2025-10-03 18:30:52.061307471 +0000 UTC m=+993.777248343" lastFinishedPulling="2025-10-03 18:30:52.835080663 +0000 UTC m=+994.551021535" observedRunningTime="2025-10-03 18:30:54.162723219 +0000 UTC m=+995.878664101" watchObservedRunningTime="2025-10-03 18:30:54.16356252 +0000 UTC m=+995.879503382" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.209949 4835 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.227651 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:54 crc kubenswrapper[4835]: E1003 18:30:54.228249 4835 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 18:30:54 crc kubenswrapper[4835]: E1003 18:30:54.228274 4835 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 18:30:54 crc kubenswrapper[4835]: E1003 18:30:54.228315 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift podName:c8359224-c77a-4d86-878b-6f073225ed33 nodeName:}" failed. No retries permitted until 2025-10-03 18:30:55.228298521 +0000 UTC m=+996.944239393 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift") pod "swift-storage-0" (UID: "c8359224-c77a-4d86-878b-6f073225ed33") : configmap "swift-ring-files" not found Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.640242 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mflzl"] Oct 03 18:30:54 crc kubenswrapper[4835]: W1003 18:30:54.640282 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac2daa29_f8b3_4aa7_a4ac_f75d003fb6f2.slice/crio-864087d9e2ff73ef29976fea1561a41ad860f3a5d3194f55001031973be060c0 WatchSource:0}: Error finding container 864087d9e2ff73ef29976fea1561a41ad860f3a5d3194f55001031973be060c0: Status 404 returned error can't find the container with id 864087d9e2ff73ef29976fea1561a41ad860f3a5d3194f55001031973be060c0 Oct 03 18:30:54 crc kubenswrapper[4835]: I1003 18:30:54.890152 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783d16b4-0ea1-451a-bedd-2f41f406b1ea" path="/var/lib/kubelet/pods/783d16b4-0ea1-451a-bedd-2f41f406b1ea/volumes" Oct 03 18:30:55 crc kubenswrapper[4835]: I1003 18:30:55.083591 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" event={"ID":"dbd81335-256c-4b39-bb05-096ba524b652","Type":"ContainerStarted","Data":"c03fbcfbbc7d49b7a970051bce876846f100e18c6ef3d83a8388563ff2e963af"} Oct 03 18:30:55 crc kubenswrapper[4835]: I1003 18:30:55.083731 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:30:55 crc kubenswrapper[4835]: I1003 18:30:55.087572 4835 generic.go:334] "Generic (PLEG): container finished" podID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerID="7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175" exitCode=0 Oct 03 18:30:55 crc kubenswrapper[4835]: I1003 18:30:55.087656 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1c6c5b2c-c368-4237-83ec-18ae3d06fe61","Type":"ContainerDied","Data":"7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175"} Oct 03 18:30:55 crc kubenswrapper[4835]: I1003 18:30:55.096523 4835 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"71cb8688-6214-4e5e-a7da-051c5939df65","Type":"ContainerStarted","Data":"2accb32ba147ca399ac3d4808e70778b40adfd30402810a6439e1e9ba4a8a860"} Oct 03 18:30:55 crc kubenswrapper[4835]: I1003 18:30:55.103717 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" podStartSLOduration=3.103698111 podStartE2EDuration="3.103698111s" podCreationTimestamp="2025-10-03 18:30:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:30:55.098991286 +0000 UTC m=+996.814932178" watchObservedRunningTime="2025-10-03 18:30:55.103698111 +0000 UTC m=+996.819638983" Oct 03 18:30:55 crc kubenswrapper[4835]: I1003 18:30:55.125128 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mflzl" event={"ID":"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2","Type":"ContainerStarted","Data":"864087d9e2ff73ef29976fea1561a41ad860f3a5d3194f55001031973be060c0"} Oct 03 18:30:55 crc kubenswrapper[4835]: I1003 18:30:55.171954 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.137884479 podStartE2EDuration="37.171936759s" podCreationTimestamp="2025-10-03 18:30:18 +0000 UTC" firstStartedPulling="2025-10-03 18:30:29.049525085 +0000 UTC m=+970.765465957" lastFinishedPulling="2025-10-03 18:30:40.083577365 +0000 UTC m=+981.799518237" observedRunningTime="2025-10-03 18:30:55.141919221 +0000 UTC m=+996.857860093" watchObservedRunningTime="2025-10-03 18:30:55.171936759 +0000 UTC m=+996.887877631" Oct 03 18:30:55 crc kubenswrapper[4835]: I1003 18:30:55.242489 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:55 crc kubenswrapper[4835]: E1003 18:30:55.242898 4835 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 18:30:55 crc kubenswrapper[4835]: E1003 18:30:55.242921 4835 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 18:30:55 crc kubenswrapper[4835]: E1003 18:30:55.242961 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift podName:c8359224-c77a-4d86-878b-6f073225ed33 nodeName:}" failed. No retries permitted until 2025-10-03 18:30:57.242945844 +0000 UTC m=+998.958886826 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift") pod "swift-storage-0" (UID: "c8359224-c77a-4d86-878b-6f073225ed33") : configmap "swift-ring-files" not found Oct 03 18:30:56 crc kubenswrapper[4835]: E1003 18:30:56.861624 4835 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.173:54556->38.102.83.173:34081: write tcp 38.102.83.173:54556->38.102.83.173:34081: write: connection reset by peer Oct 03 18:30:57 crc kubenswrapper[4835]: I1003 18:30:57.142513 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mflzl" event={"ID":"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2","Type":"ContainerStarted","Data":"15f82a6c31b176d2f4801fab274b517447313fd33088f4d0a9cf54beec3eb9ac"} Oct 03 18:30:57 crc kubenswrapper[4835]: I1003 18:30:57.161417 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mflzl" podStartSLOduration=2.012427167 podStartE2EDuration="4.161400775s" podCreationTimestamp="2025-10-03 18:30:53 +0000 UTC" firstStartedPulling="2025-10-03 18:30:54.642824161 +0000 UTC m=+996.358765033" lastFinishedPulling="2025-10-03 18:30:56.791797779 +0000 UTC m=+998.507738641" observedRunningTime="2025-10-03 18:30:57.16040527 +0000 UTC m=+998.876346152" watchObservedRunningTime="2025-10-03 18:30:57.161400775 +0000 UTC m=+998.877341647" Oct 03 18:30:57 crc kubenswrapper[4835]: I1003 18:30:57.286259 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:30:57 crc kubenswrapper[4835]: E1003 18:30:57.286475 4835 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 18:30:57 crc kubenswrapper[4835]: E1003 18:30:57.286642 4835 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 18:30:57 crc kubenswrapper[4835]: E1003 18:30:57.286711 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift podName:c8359224-c77a-4d86-878b-6f073225ed33 nodeName:}" failed. No retries permitted until 2025-10-03 18:31:01.286696194 +0000 UTC m=+1003.002637066 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift") pod "swift-storage-0" (UID: "c8359224-c77a-4d86-878b-6f073225ed33") : configmap "swift-ring-files" not found Oct 03 18:30:59 crc kubenswrapper[4835]: I1003 18:30:59.968920 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 03 18:30:59 crc kubenswrapper[4835]: I1003 18:30:59.969398 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.052568 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.225511 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.363563 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.363627 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.466957 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-2qv8l"] Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.468266 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2qv8l" Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.471647 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2qv8l"] Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.542361 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwwdk\" (UniqueName: \"kubernetes.io/projected/ef013310-2986-4594-a8a2-f9d0f9f686df-kube-api-access-wwwdk\") pod \"keystone-db-create-2qv8l\" (UID: \"ef013310-2986-4594-a8a2-f9d0f9f686df\") " pod="openstack/keystone-db-create-2qv8l" Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.643904 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwwdk\" (UniqueName: \"kubernetes.io/projected/ef013310-2986-4594-a8a2-f9d0f9f686df-kube-api-access-wwwdk\") pod \"keystone-db-create-2qv8l\" (UID: \"ef013310-2986-4594-a8a2-f9d0f9f686df\") " pod="openstack/keystone-db-create-2qv8l" Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.664771 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-676cv"] Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.665910 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-676cv" Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.678544 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwwdk\" (UniqueName: \"kubernetes.io/projected/ef013310-2986-4594-a8a2-f9d0f9f686df-kube-api-access-wwwdk\") pod \"keystone-db-create-2qv8l\" (UID: \"ef013310-2986-4594-a8a2-f9d0f9f686df\") " pod="openstack/keystone-db-create-2qv8l" Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.679062 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-676cv"] Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.746194 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ggz6\" (UniqueName: \"kubernetes.io/projected/7c3b62d6-8a47-48b7-8bb7-ffc031a67eef-kube-api-access-2ggz6\") pod \"placement-db-create-676cv\" (UID: \"7c3b62d6-8a47-48b7-8bb7-ffc031a67eef\") " pod="openstack/placement-db-create-676cv" Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.788685 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2qv8l" Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.849328 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ggz6\" (UniqueName: \"kubernetes.io/projected/7c3b62d6-8a47-48b7-8bb7-ffc031a67eef-kube-api-access-2ggz6\") pod \"placement-db-create-676cv\" (UID: \"7c3b62d6-8a47-48b7-8bb7-ffc031a67eef\") " pod="openstack/placement-db-create-676cv" Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.865233 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ggz6\" (UniqueName: \"kubernetes.io/projected/7c3b62d6-8a47-48b7-8bb7-ffc031a67eef-kube-api-access-2ggz6\") pod \"placement-db-create-676cv\" (UID: \"7c3b62d6-8a47-48b7-8bb7-ffc031a67eef\") " pod="openstack/placement-db-create-676cv" Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.972852 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-n5gjl"] Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.974310 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n5gjl" Oct 03 18:31:00 crc kubenswrapper[4835]: I1003 18:31:00.978100 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-n5gjl"] Oct 03 18:31:01 crc kubenswrapper[4835]: I1003 18:31:01.014771 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-676cv" Oct 03 18:31:01 crc kubenswrapper[4835]: I1003 18:31:01.052406 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxtdx\" (UniqueName: \"kubernetes.io/projected/991a2cf0-ae6d-4c94-8bc2-f0f605077c13-kube-api-access-wxtdx\") pod \"glance-db-create-n5gjl\" (UID: \"991a2cf0-ae6d-4c94-8bc2-f0f605077c13\") " pod="openstack/glance-db-create-n5gjl" Oct 03 18:31:01 crc kubenswrapper[4835]: I1003 18:31:01.153917 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxtdx\" (UniqueName: \"kubernetes.io/projected/991a2cf0-ae6d-4c94-8bc2-f0f605077c13-kube-api-access-wxtdx\") pod \"glance-db-create-n5gjl\" (UID: \"991a2cf0-ae6d-4c94-8bc2-f0f605077c13\") " pod="openstack/glance-db-create-n5gjl" Oct 03 18:31:01 crc kubenswrapper[4835]: I1003 18:31:01.176292 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxtdx\" (UniqueName: \"kubernetes.io/projected/991a2cf0-ae6d-4c94-8bc2-f0f605077c13-kube-api-access-wxtdx\") pod \"glance-db-create-n5gjl\" (UID: \"991a2cf0-ae6d-4c94-8bc2-f0f605077c13\") " pod="openstack/glance-db-create-n5gjl" Oct 03 18:31:01 crc kubenswrapper[4835]: I1003 18:31:01.236427 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 03 18:31:01 crc kubenswrapper[4835]: I1003 18:31:01.296856 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n5gjl" Oct 03 18:31:01 crc kubenswrapper[4835]: I1003 18:31:01.301380 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 03 18:31:01 crc kubenswrapper[4835]: I1003 18:31:01.357644 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:31:01 crc kubenswrapper[4835]: E1003 18:31:01.357898 4835 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 18:31:01 crc kubenswrapper[4835]: E1003 18:31:01.358008 4835 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 18:31:01 crc kubenswrapper[4835]: E1003 18:31:01.358122 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift podName:c8359224-c77a-4d86-878b-6f073225ed33 nodeName:}" failed. No retries permitted until 2025-10-03 18:31:09.35810365 +0000 UTC m=+1011.074044512 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift") pod "swift-storage-0" (UID: "c8359224-c77a-4d86-878b-6f073225ed33") : configmap "swift-ring-files" not found Oct 03 18:31:02 crc kubenswrapper[4835]: I1003 18:31:02.355008 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-jfv8f"] Oct 03 18:31:02 crc kubenswrapper[4835]: I1003 18:31:02.357506 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-jfv8f" Oct 03 18:31:02 crc kubenswrapper[4835]: I1003 18:31:02.361166 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-jfv8f"] Oct 03 18:31:02 crc kubenswrapper[4835]: I1003 18:31:02.477274 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x89b6\" (UniqueName: \"kubernetes.io/projected/8906174e-7e3e-4c75-916d-b9dc0428ed75-kube-api-access-x89b6\") pod \"watcher-db-create-jfv8f\" (UID: \"8906174e-7e3e-4c75-916d-b9dc0428ed75\") " pod="openstack/watcher-db-create-jfv8f" Oct 03 18:31:02 crc kubenswrapper[4835]: I1003 18:31:02.578835 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x89b6\" (UniqueName: \"kubernetes.io/projected/8906174e-7e3e-4c75-916d-b9dc0428ed75-kube-api-access-x89b6\") pod \"watcher-db-create-jfv8f\" (UID: \"8906174e-7e3e-4c75-916d-b9dc0428ed75\") " pod="openstack/watcher-db-create-jfv8f" Oct 03 18:31:02 crc kubenswrapper[4835]: I1003 18:31:02.608699 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x89b6\" (UniqueName: \"kubernetes.io/projected/8906174e-7e3e-4c75-916d-b9dc0428ed75-kube-api-access-x89b6\") pod \"watcher-db-create-jfv8f\" (UID: \"8906174e-7e3e-4c75-916d-b9dc0428ed75\") " pod="openstack/watcher-db-create-jfv8f" Oct 03 18:31:02 crc kubenswrapper[4835]: I1003 18:31:02.688562 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-jfv8f" Oct 03 18:31:02 crc kubenswrapper[4835]: I1003 18:31:02.762268 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:31:02 crc kubenswrapper[4835]: I1003 18:31:02.815830 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6564457b49-fg4vg"] Oct 03 18:31:02 crc kubenswrapper[4835]: I1003 18:31:02.816055 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6564457b49-fg4vg" podUID="ba05d20e-0cd7-4c1a-bee5-45f439b42518" containerName="dnsmasq-dns" containerID="cri-o://2fff0a00a2a2536ac8123420f6a9a2c9c3134e387db28acdc79587c567deaf56" gracePeriod=10 Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.052914 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2qv8l"] Oct 03 18:31:03 crc kubenswrapper[4835]: W1003 18:31:03.061157 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef013310_2986_4594_a8a2_f9d0f9f686df.slice/crio-90c11c3ae82a42f8483e3c47a68c52aedfcc801918aba34d5501ff3858539251 WatchSource:0}: Error finding container 90c11c3ae82a42f8483e3c47a68c52aedfcc801918aba34d5501ff3858539251: Status 404 returned error can't find the container with id 90c11c3ae82a42f8483e3c47a68c52aedfcc801918aba34d5501ff3858539251 Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.195585 4835 generic.go:334] "Generic (PLEG): container finished" podID="ba05d20e-0cd7-4c1a-bee5-45f439b42518" containerID="2fff0a00a2a2536ac8123420f6a9a2c9c3134e387db28acdc79587c567deaf56" exitCode=0 Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.195635 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6564457b49-fg4vg" 
event={"ID":"ba05d20e-0cd7-4c1a-bee5-45f439b42518","Type":"ContainerDied","Data":"2fff0a00a2a2536ac8123420f6a9a2c9c3134e387db28acdc79587c567deaf56"} Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.197372 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2qv8l" event={"ID":"ef013310-2986-4594-a8a2-f9d0f9f686df","Type":"ContainerStarted","Data":"90c11c3ae82a42f8483e3c47a68c52aedfcc801918aba34d5501ff3858539251"} Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.198760 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1c6c5b2c-c368-4237-83ec-18ae3d06fe61","Type":"ContainerStarted","Data":"07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a"} Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.301523 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6564457b49-fg4vg" Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.338261 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-jfv8f"] Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.357439 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-676cv"] Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.367822 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-n5gjl"] Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.391609 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba05d20e-0cd7-4c1a-bee5-45f439b42518-config\") pod \"ba05d20e-0cd7-4c1a-bee5-45f439b42518\" (UID: \"ba05d20e-0cd7-4c1a-bee5-45f439b42518\") " Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.391797 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhjnb\" (UniqueName: \"kubernetes.io/projected/ba05d20e-0cd7-4c1a-bee5-45f439b42518-kube-api-access-xhjnb\") pod \"ba05d20e-0cd7-4c1a-bee5-45f439b42518\" (UID: \"ba05d20e-0cd7-4c1a-bee5-45f439b42518\") " Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.391857 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba05d20e-0cd7-4c1a-bee5-45f439b42518-dns-svc\") pod \"ba05d20e-0cd7-4c1a-bee5-45f439b42518\" (UID: \"ba05d20e-0cd7-4c1a-bee5-45f439b42518\") " Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.396906 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba05d20e-0cd7-4c1a-bee5-45f439b42518-kube-api-access-xhjnb" (OuterVolumeSpecName: "kube-api-access-xhjnb") pod "ba05d20e-0cd7-4c1a-bee5-45f439b42518" (UID: "ba05d20e-0cd7-4c1a-bee5-45f439b42518"). InnerVolumeSpecName "kube-api-access-xhjnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.441334 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba05d20e-0cd7-4c1a-bee5-45f439b42518-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba05d20e-0cd7-4c1a-bee5-45f439b42518" (UID: "ba05d20e-0cd7-4c1a-bee5-45f439b42518"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.453627 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba05d20e-0cd7-4c1a-bee5-45f439b42518-config" (OuterVolumeSpecName: "config") pod "ba05d20e-0cd7-4c1a-bee5-45f439b42518" (UID: "ba05d20e-0cd7-4c1a-bee5-45f439b42518"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.493437 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhjnb\" (UniqueName: \"kubernetes.io/projected/ba05d20e-0cd7-4c1a-bee5-45f439b42518-kube-api-access-xhjnb\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.493469 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba05d20e-0cd7-4c1a-bee5-45f439b42518-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:03 crc kubenswrapper[4835]: I1003 18:31:03.493480 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba05d20e-0cd7-4c1a-bee5-45f439b42518-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.208528 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6564457b49-fg4vg" Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.208589 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6564457b49-fg4vg" event={"ID":"ba05d20e-0cd7-4c1a-bee5-45f439b42518","Type":"ContainerDied","Data":"d2f8415eb3d32205846b4101855825f27fa43c43f90138bd0b87b84279d7efd5"} Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.208658 4835 scope.go:117] "RemoveContainer" containerID="2fff0a00a2a2536ac8123420f6a9a2c9c3134e387db28acdc79587c567deaf56" Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.236183 4835 generic.go:334] "Generic (PLEG): container finished" podID="ef013310-2986-4594-a8a2-f9d0f9f686df" containerID="7489ec8d94a3bcbb02f511b57c58518fbc1e5a9783369f1dbc69e8bb2c3dc3ba" exitCode=0 Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.236348 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2qv8l" event={"ID":"ef013310-2986-4594-a8a2-f9d0f9f686df","Type":"ContainerDied","Data":"7489ec8d94a3bcbb02f511b57c58518fbc1e5a9783369f1dbc69e8bb2c3dc3ba"} Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.236642 4835 scope.go:117] "RemoveContainer" containerID="cfaf8ddd76234769f32c6fbbc1b11ea5a0c3ab4eceb65c985e90b4146de76573" Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.240250 4835 generic.go:334] "Generic (PLEG): container finished" podID="991a2cf0-ae6d-4c94-8bc2-f0f605077c13" containerID="f872b59b883c8e4f20f046bbe9244e290359ac55a59d667c9bf1eedde534a9d4" exitCode=0 Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.240308 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n5gjl" event={"ID":"991a2cf0-ae6d-4c94-8bc2-f0f605077c13","Type":"ContainerDied","Data":"f872b59b883c8e4f20f046bbe9244e290359ac55a59d667c9bf1eedde534a9d4"} Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.240331 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n5gjl" 
event={"ID":"991a2cf0-ae6d-4c94-8bc2-f0f605077c13","Type":"ContainerStarted","Data":"ee9101bf69f1937de6ba9baef81952650e9edeb033ac03a99c39adff72f7a430"} Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.242814 4835 generic.go:334] "Generic (PLEG): container finished" podID="7c3b62d6-8a47-48b7-8bb7-ffc031a67eef" containerID="449f4e38da63229c20e564420d06245b82718e77ca552ccf9765b7c84cae3e3e" exitCode=0 Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.242882 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-676cv" event={"ID":"7c3b62d6-8a47-48b7-8bb7-ffc031a67eef","Type":"ContainerDied","Data":"449f4e38da63229c20e564420d06245b82718e77ca552ccf9765b7c84cae3e3e"} Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.242907 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-676cv" event={"ID":"7c3b62d6-8a47-48b7-8bb7-ffc031a67eef","Type":"ContainerStarted","Data":"75994ecf141da45d37df956571ede31b4e7893683546672c819773db5848ac18"} Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.252354 4835 generic.go:334] "Generic (PLEG): container finished" podID="8906174e-7e3e-4c75-916d-b9dc0428ed75" containerID="2eee8ab267f08fb7635468a90b258465d6557cb2b615418f5f894c8d3cba641e" exitCode=0 Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.252412 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-jfv8f" event={"ID":"8906174e-7e3e-4c75-916d-b9dc0428ed75","Type":"ContainerDied","Data":"2eee8ab267f08fb7635468a90b258465d6557cb2b615418f5f894c8d3cba641e"} Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.252437 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-jfv8f" event={"ID":"8906174e-7e3e-4c75-916d-b9dc0428ed75","Type":"ContainerStarted","Data":"1f9ea4fe606d8a33570b292eb59926c26de5222b6a532bd049a3c1ea272fc042"} Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.260744 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6564457b49-fg4vg"] Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.265410 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6564457b49-fg4vg"] Oct 03 18:31:04 crc kubenswrapper[4835]: I1003 18:31:04.886370 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba05d20e-0cd7-4c1a-bee5-45f439b42518" path="/var/lib/kubelet/pods/ba05d20e-0cd7-4c1a-bee5-45f439b42518/volumes" Oct 03 18:31:05 crc kubenswrapper[4835]: I1003 18:31:05.266808 4835 generic.go:334] "Generic (PLEG): container finished" podID="ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2" containerID="15f82a6c31b176d2f4801fab274b517447313fd33088f4d0a9cf54beec3eb9ac" exitCode=0 Oct 03 18:31:05 crc kubenswrapper[4835]: I1003 18:31:05.266843 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mflzl" event={"ID":"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2","Type":"ContainerDied","Data":"15f82a6c31b176d2f4801fab274b517447313fd33088f4d0a9cf54beec3eb9ac"} Oct 03 18:31:05 crc kubenswrapper[4835]: I1003 18:31:05.269597 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1c6c5b2c-c368-4237-83ec-18ae3d06fe61","Type":"ContainerStarted","Data":"7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13"} Oct 03 18:31:05 crc kubenswrapper[4835]: I1003 18:31:05.689451 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-jfv8f" Oct 03 18:31:05 crc kubenswrapper[4835]: I1003 18:31:05.730053 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x89b6\" (UniqueName: \"kubernetes.io/projected/8906174e-7e3e-4c75-916d-b9dc0428ed75-kube-api-access-x89b6\") pod \"8906174e-7e3e-4c75-916d-b9dc0428ed75\" (UID: \"8906174e-7e3e-4c75-916d-b9dc0428ed75\") " Oct 03 18:31:05 crc kubenswrapper[4835]: I1003 18:31:05.746597 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8906174e-7e3e-4c75-916d-b9dc0428ed75-kube-api-access-x89b6" (OuterVolumeSpecName: "kube-api-access-x89b6") pod "8906174e-7e3e-4c75-916d-b9dc0428ed75" (UID: "8906174e-7e3e-4c75-916d-b9dc0428ed75"). InnerVolumeSpecName "kube-api-access-x89b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:05 crc kubenswrapper[4835]: I1003 18:31:05.827455 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-n5gjl" Oct 03 18:31:05 crc kubenswrapper[4835]: I1003 18:31:05.831921 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x89b6\" (UniqueName: \"kubernetes.io/projected/8906174e-7e3e-4c75-916d-b9dc0428ed75-kube-api-access-x89b6\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:05 crc kubenswrapper[4835]: I1003 18:31:05.833612 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2qv8l" Oct 03 18:31:05 crc kubenswrapper[4835]: I1003 18:31:05.838887 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-676cv" Oct 03 18:31:05 crc kubenswrapper[4835]: I1003 18:31:05.932738 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxtdx\" (UniqueName: \"kubernetes.io/projected/991a2cf0-ae6d-4c94-8bc2-f0f605077c13-kube-api-access-wxtdx\") pod \"991a2cf0-ae6d-4c94-8bc2-f0f605077c13\" (UID: \"991a2cf0-ae6d-4c94-8bc2-f0f605077c13\") " Oct 03 18:31:05 crc kubenswrapper[4835]: I1003 18:31:05.932895 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwwdk\" (UniqueName: \"kubernetes.io/projected/ef013310-2986-4594-a8a2-f9d0f9f686df-kube-api-access-wwwdk\") pod \"ef013310-2986-4594-a8a2-f9d0f9f686df\" (UID: \"ef013310-2986-4594-a8a2-f9d0f9f686df\") " Oct 03 18:31:05 crc kubenswrapper[4835]: I1003 18:31:05.932962 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ggz6\" (UniqueName: \"kubernetes.io/projected/7c3b62d6-8a47-48b7-8bb7-ffc031a67eef-kube-api-access-2ggz6\") pod \"7c3b62d6-8a47-48b7-8bb7-ffc031a67eef\" (UID: \"7c3b62d6-8a47-48b7-8bb7-ffc031a67eef\") " Oct 03 18:31:05 crc kubenswrapper[4835]: I1003 18:31:05.936800 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef013310-2986-4594-a8a2-f9d0f9f686df-kube-api-access-wwwdk" (OuterVolumeSpecName: "kube-api-access-wwwdk") pod "ef013310-2986-4594-a8a2-f9d0f9f686df" (UID: "ef013310-2986-4594-a8a2-f9d0f9f686df"). InnerVolumeSpecName "kube-api-access-wwwdk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:05 crc kubenswrapper[4835]: I1003 18:31:05.936840 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/991a2cf0-ae6d-4c94-8bc2-f0f605077c13-kube-api-access-wxtdx" (OuterVolumeSpecName: "kube-api-access-wxtdx") pod "991a2cf0-ae6d-4c94-8bc2-f0f605077c13" (UID: "991a2cf0-ae6d-4c94-8bc2-f0f605077c13"). InnerVolumeSpecName "kube-api-access-wxtdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:05 crc kubenswrapper[4835]: I1003 18:31:05.942263 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c3b62d6-8a47-48b7-8bb7-ffc031a67eef-kube-api-access-2ggz6" (OuterVolumeSpecName: "kube-api-access-2ggz6") pod "7c3b62d6-8a47-48b7-8bb7-ffc031a67eef" (UID: "7c3b62d6-8a47-48b7-8bb7-ffc031a67eef"). InnerVolumeSpecName "kube-api-access-2ggz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.035469 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwwdk\" (UniqueName: \"kubernetes.io/projected/ef013310-2986-4594-a8a2-f9d0f9f686df-kube-api-access-wwwdk\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.035502 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ggz6\" (UniqueName: \"kubernetes.io/projected/7c3b62d6-8a47-48b7-8bb7-ffc031a67eef-kube-api-access-2ggz6\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.035512 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxtdx\" (UniqueName: \"kubernetes.io/projected/991a2cf0-ae6d-4c94-8bc2-f0f605077c13-kube-api-access-wxtdx\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.280672 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2qv8l" event={"ID":"ef013310-2986-4594-a8a2-f9d0f9f686df","Type":"ContainerDied","Data":"90c11c3ae82a42f8483e3c47a68c52aedfcc801918aba34d5501ff3858539251"} Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.280758 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90c11c3ae82a42f8483e3c47a68c52aedfcc801918aba34d5501ff3858539251" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.280807 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2qv8l" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.282294 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-n5gjl" event={"ID":"991a2cf0-ae6d-4c94-8bc2-f0f605077c13","Type":"ContainerDied","Data":"ee9101bf69f1937de6ba9baef81952650e9edeb033ac03a99c39adff72f7a430"} Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.282317 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee9101bf69f1937de6ba9baef81952650e9edeb033ac03a99c39adff72f7a430" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.282348 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-n5gjl" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.283639 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-676cv" event={"ID":"7c3b62d6-8a47-48b7-8bb7-ffc031a67eef","Type":"ContainerDied","Data":"75994ecf141da45d37df956571ede31b4e7893683546672c819773db5848ac18"} Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.283659 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75994ecf141da45d37df956571ede31b4e7893683546672c819773db5848ac18" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.283713 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-676cv" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.285601 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-jfv8f" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.285617 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-jfv8f" event={"ID":"8906174e-7e3e-4c75-916d-b9dc0428ed75","Type":"ContainerDied","Data":"1f9ea4fe606d8a33570b292eb59926c26de5222b6a532bd049a3c1ea272fc042"} Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.285664 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f9ea4fe606d8a33570b292eb59926c26de5222b6a532bd049a3c1ea272fc042" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.550830 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.644228 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-swiftconf\") pod \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.644306 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-dispersionconf\") pod \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.644350 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsdgw\" (UniqueName: \"kubernetes.io/projected/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-kube-api-access-hsdgw\") pod \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.644392 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-combined-ca-bundle\") pod \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.644451 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-scripts\") pod \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.644478 4835 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-etc-swift\") pod \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.644530 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-ring-data-devices\") pod \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\" (UID: \"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2\") " Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.645621 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2" (UID: "ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.647180 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2" (UID: "ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.650384 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-kube-api-access-hsdgw" (OuterVolumeSpecName: "kube-api-access-hsdgw") pod "ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2" (UID: "ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2"). InnerVolumeSpecName "kube-api-access-hsdgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.654568 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2" (UID: "ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.669185 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.672469 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2" (UID: "ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.673537 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-scripts" (OuterVolumeSpecName: "scripts") pod "ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2" (UID: "ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.676351 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2" (UID: "ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.747347 4835 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.747382 4835 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.747395 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsdgw\" (UniqueName: \"kubernetes.io/projected/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-kube-api-access-hsdgw\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.747406 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.747418 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.747428 4835 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:06 crc kubenswrapper[4835]: I1003 18:31:06.747440 4835 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:07 crc kubenswrapper[4835]: I1003 18:31:07.294905 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mflzl" event={"ID":"ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2","Type":"ContainerDied","Data":"864087d9e2ff73ef29976fea1561a41ad860f3a5d3194f55001031973be060c0"} Oct 03 18:31:07 crc kubenswrapper[4835]: I1003 18:31:07.295236 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="864087d9e2ff73ef29976fea1561a41ad860f3a5d3194f55001031973be060c0" Oct 03 18:31:07 crc kubenswrapper[4835]: I1003 18:31:07.295005 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mflzl" Oct 03 18:31:08 crc kubenswrapper[4835]: I1003 18:31:08.304978 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1c6c5b2c-c368-4237-83ec-18ae3d06fe61","Type":"ContainerStarted","Data":"060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d"} Oct 03 18:31:08 crc kubenswrapper[4835]: I1003 18:31:08.328016 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=8.062649567 podStartE2EDuration="46.327995629s" podCreationTimestamp="2025-10-03 18:30:22 +0000 UTC" firstStartedPulling="2025-10-03 18:30:29.667288788 +0000 UTC m=+971.383229660" lastFinishedPulling="2025-10-03 18:31:07.93263483 +0000 UTC m=+1009.648575722" observedRunningTime="2025-10-03 18:31:08.324860462 +0000 UTC m=+1010.040801344" watchObservedRunningTime="2025-10-03 18:31:08.327995629 +0000 UTC m=+1010.043936501" Oct 03 18:31:08 crc kubenswrapper[4835]: I1003 18:31:08.728493 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:08 crc kubenswrapper[4835]: I1003 18:31:08.728559 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:08 crc kubenswrapper[4835]: I1003 18:31:08.731900 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:09 crc kubenswrapper[4835]: I1003 18:31:09.311581 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:09 crc kubenswrapper[4835]: I1003 18:31:09.401034 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:31:09 crc kubenswrapper[4835]: I1003 18:31:09.414436 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8359224-c77a-4d86-878b-6f073225ed33-etc-swift\") pod \"swift-storage-0\" (UID: \"c8359224-c77a-4d86-878b-6f073225ed33\") " pod="openstack/swift-storage-0" Oct 03 18:31:09 crc kubenswrapper[4835]: I1003 18:31:09.473315 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 03 18:31:10 crc kubenswrapper[4835]: I1003 18:31:10.081796 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 03 18:31:10 crc kubenswrapper[4835]: I1003 18:31:10.319916 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8359224-c77a-4d86-878b-6f073225ed33","Type":"ContainerStarted","Data":"79778e305e915096022370c95ac2ed74f005d5dc30126f868d51f53b44f7b9a6"} Oct 03 18:31:11 crc kubenswrapper[4835]: I1003 18:31:11.333582 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8359224-c77a-4d86-878b-6f073225ed33","Type":"ContainerStarted","Data":"e3bdb9e5045a6720d5c8746b24d8232786de3a7b3b41d9b289ca3e64711616dd"} Oct 03 18:31:11 crc kubenswrapper[4835]: I1003 18:31:11.333966 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8359224-c77a-4d86-878b-6f073225ed33","Type":"ContainerStarted","Data":"b145709613167047d1580161e763c78f388109d55ed73a18f5413b4188ee3d23"} Oct 03 18:31:11 crc kubenswrapper[4835]: I1003 18:31:11.333983 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8359224-c77a-4d86-878b-6f073225ed33","Type":"ContainerStarted","Data":"59d1856d15977fe0921616581d1a1894d516401b4b45f95f3ae5ba6fa810d7f9"} Oct 03 18:31:11 crc kubenswrapper[4835]: I1003 18:31:11.333994 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8359224-c77a-4d86-878b-6f073225ed33","Type":"ContainerStarted","Data":"9d83673772742431392530d64dbcf37a4d7a00e37a8e8a348dc2b91241eabd24"} Oct 03 18:31:11 crc kubenswrapper[4835]: I1003 18:31:11.381933 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.339195 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerName="prometheus" containerID="cri-o://07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a" gracePeriod=600 Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.339257 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerName="thanos-sidecar" containerID="cri-o://060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d" gracePeriod=600 Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.339266 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerName="config-reloader" containerID="cri-o://7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13" gracePeriod=600 Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.410413 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-8132-account-create-xwj7q"] Oct 03 18:31:12 crc kubenswrapper[4835]: E1003 18:31:12.410966 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef013310-2986-4594-a8a2-f9d0f9f686df" containerName="mariadb-database-create" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.410983 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef013310-2986-4594-a8a2-f9d0f9f686df" containerName="mariadb-database-create" Oct 03 
18:31:12 crc kubenswrapper[4835]: E1003 18:31:12.410998 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba05d20e-0cd7-4c1a-bee5-45f439b42518" containerName="init" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.411005 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba05d20e-0cd7-4c1a-bee5-45f439b42518" containerName="init" Oct 03 18:31:12 crc kubenswrapper[4835]: E1003 18:31:12.411019 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2" containerName="swift-ring-rebalance" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.411025 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2" containerName="swift-ring-rebalance" Oct 03 18:31:12 crc kubenswrapper[4835]: E1003 18:31:12.411035 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8906174e-7e3e-4c75-916d-b9dc0428ed75" containerName="mariadb-database-create" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.411041 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8906174e-7e3e-4c75-916d-b9dc0428ed75" containerName="mariadb-database-create" Oct 03 18:31:12 crc kubenswrapper[4835]: E1003 18:31:12.411056 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba05d20e-0cd7-4c1a-bee5-45f439b42518" containerName="dnsmasq-dns" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.411061 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba05d20e-0cd7-4c1a-bee5-45f439b42518" containerName="dnsmasq-dns" Oct 03 18:31:12 crc kubenswrapper[4835]: E1003 18:31:12.411136 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3b62d6-8a47-48b7-8bb7-ffc031a67eef" containerName="mariadb-database-create" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.411144 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3b62d6-8a47-48b7-8bb7-ffc031a67eef" containerName="mariadb-database-create" Oct 03 18:31:12 crc kubenswrapper[4835]: E1003 18:31:12.411155 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="991a2cf0-ae6d-4c94-8bc2-f0f605077c13" containerName="mariadb-database-create" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.411160 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="991a2cf0-ae6d-4c94-8bc2-f0f605077c13" containerName="mariadb-database-create" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.411323 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef013310-2986-4594-a8a2-f9d0f9f686df" containerName="mariadb-database-create" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.411340 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c3b62d6-8a47-48b7-8bb7-ffc031a67eef" containerName="mariadb-database-create" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.411352 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="991a2cf0-ae6d-4c94-8bc2-f0f605077c13" containerName="mariadb-database-create" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.411377 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba05d20e-0cd7-4c1a-bee5-45f439b42518" containerName="dnsmasq-dns" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.411390 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2" containerName="swift-ring-rebalance" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.411401 4835 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="8906174e-7e3e-4c75-916d-b9dc0428ed75" containerName="mariadb-database-create" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.412006 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-8132-account-create-xwj7q" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.413940 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.467824 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thdvz\" (UniqueName: \"kubernetes.io/projected/a1f1068f-9964-41d4-909d-19cc6c035a73-kube-api-access-thdvz\") pod \"watcher-8132-account-create-xwj7q\" (UID: \"a1f1068f-9964-41d4-909d-19cc6c035a73\") " pod="openstack/watcher-8132-account-create-xwj7q" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.470120 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-8132-account-create-xwj7q"] Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.569566 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thdvz\" (UniqueName: \"kubernetes.io/projected/a1f1068f-9964-41d4-909d-19cc6c035a73-kube-api-access-thdvz\") pod \"watcher-8132-account-create-xwj7q\" (UID: \"a1f1068f-9964-41d4-909d-19cc6c035a73\") " pod="openstack/watcher-8132-account-create-xwj7q" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.587263 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thdvz\" (UniqueName: \"kubernetes.io/projected/a1f1068f-9964-41d4-909d-19cc6c035a73-kube-api-access-thdvz\") pod \"watcher-8132-account-create-xwj7q\" (UID: \"a1f1068f-9964-41d4-909d-19cc6c035a73\") " pod="openstack/watcher-8132-account-create-xwj7q" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.821461 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-8132-account-create-xwj7q" Oct 03 18:31:12 crc kubenswrapper[4835]: I1003 18:31:12.940122 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.083548 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-prometheus-metric-storage-rulefiles-0\") pod \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.083873 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-web-config\") pod \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.084069 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") pod \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.084130 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2jbm\" (UniqueName: \"kubernetes.io/projected/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-kube-api-access-s2jbm\") pod \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.084215 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-config\") pod \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.084287 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-config-out\") pod \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.084302 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-tls-assets\") pod \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.084330 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-thanos-prometheus-http-client-file\") pod \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\" (UID: \"1c6c5b2c-c368-4237-83ec-18ae3d06fe61\") " Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.089600 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "1c6c5b2c-c368-4237-83ec-18ae3d06fe61" (UID: "1c6c5b2c-c368-4237-83ec-18ae3d06fe61"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.100236 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1c6c5b2c-c368-4237-83ec-18ae3d06fe61" (UID: "1c6c5b2c-c368-4237-83ec-18ae3d06fe61"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.111236 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-config-out" (OuterVolumeSpecName: "config-out") pod "1c6c5b2c-c368-4237-83ec-18ae3d06fe61" (UID: "1c6c5b2c-c368-4237-83ec-18ae3d06fe61"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.120255 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-config" (OuterVolumeSpecName: "config") pod "1c6c5b2c-c368-4237-83ec-18ae3d06fe61" (UID: "1c6c5b2c-c368-4237-83ec-18ae3d06fe61"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.126668 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1c6c5b2c-c368-4237-83ec-18ae3d06fe61" (UID: "1c6c5b2c-c368-4237-83ec-18ae3d06fe61"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.126742 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-kube-api-access-s2jbm" (OuterVolumeSpecName: "kube-api-access-s2jbm") pod "1c6c5b2c-c368-4237-83ec-18ae3d06fe61" (UID: "1c6c5b2c-c368-4237-83ec-18ae3d06fe61"). InnerVolumeSpecName "kube-api-access-s2jbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.182469 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-web-config" (OuterVolumeSpecName: "web-config") pod "1c6c5b2c-c368-4237-83ec-18ae3d06fe61" (UID: "1c6c5b2c-c368-4237-83ec-18ae3d06fe61"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.186312 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.186343 4835 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.186356 4835 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-config-out\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.186368 4835 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.186383 4835 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.186396 4835 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-web-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.186407 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2jbm\" (UniqueName: \"kubernetes.io/projected/1c6c5b2c-c368-4237-83ec-18ae3d06fe61-kube-api-access-s2jbm\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.196174 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "1c6c5b2c-c368-4237-83ec-18ae3d06fe61" (UID: "1c6c5b2c-c368-4237-83ec-18ae3d06fe61"). InnerVolumeSpecName "pvc-bed34091-a921-4906-9121-f482ec67e99a". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.284785 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-8132-account-create-xwj7q"] Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.287802 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-bed34091-a921-4906-9121-f482ec67e99a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") on node \"crc\" " Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.324476 4835 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.324743 4835 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-bed34091-a921-4906-9121-f482ec67e99a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a") on node "crc" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.358923 4835 generic.go:334] "Generic (PLEG): container finished" podID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerID="060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d" exitCode=0 Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.358966 4835 generic.go:334] "Generic (PLEG): container finished" podID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerID="7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13" exitCode=0 Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.358980 4835 generic.go:334] "Generic (PLEG): container finished" podID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerID="07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a" exitCode=0 Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.359029 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1c6c5b2c-c368-4237-83ec-18ae3d06fe61","Type":"ContainerDied","Data":"060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d"} Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.359062 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1c6c5b2c-c368-4237-83ec-18ae3d06fe61","Type":"ContainerDied","Data":"7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13"} Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.359097 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1c6c5b2c-c368-4237-83ec-18ae3d06fe61","Type":"ContainerDied","Data":"07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a"} Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.359111 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1c6c5b2c-c368-4237-83ec-18ae3d06fe61","Type":"ContainerDied","Data":"b798453bcd08208fc60fcfb773c694f424592fedd19f0134ed4d271d4ea56a9a"} Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.359132 4835 scope.go:117] "RemoveContainer" containerID="060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.359295 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.376906 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8359224-c77a-4d86-878b-6f073225ed33","Type":"ContainerStarted","Data":"91e98e02e30feed3e39d9b0c6e07b226a646d9a82e7e1df568de0c85c4b07955"} Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.376957 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8359224-c77a-4d86-878b-6f073225ed33","Type":"ContainerStarted","Data":"3f8a344d7037b378952c46e0e6bd68a3a1fa3ae1152ebd6ab23bfc6686c6f86f"} Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.376969 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8359224-c77a-4d86-878b-6f073225ed33","Type":"ContainerStarted","Data":"6719bdc3fe9bd473bdeb01c252e8a25137bb2253e7a5a2c0a0b6d017ee78d593"} Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.378683 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-8132-account-create-xwj7q" event={"ID":"a1f1068f-9964-41d4-909d-19cc6c035a73","Type":"ContainerStarted","Data":"9f72837d927e1595203ee1fd4951a8f53b69f9e84f6f972feb93ce78f46d08ff"} Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.390557 4835 reconciler_common.go:293] "Volume detached for volume \"pvc-bed34091-a921-4906-9121-f482ec67e99a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.406275 4835 scope.go:117] "RemoveContainer" containerID="7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.413360 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.424543 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.435324 4835 scope.go:117] "RemoveContainer" containerID="07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.453888 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 18:31:13 crc kubenswrapper[4835]: E1003 18:31:13.460401 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerName="thanos-sidecar" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.460441 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerName="thanos-sidecar" Oct 03 18:31:13 crc kubenswrapper[4835]: E1003 18:31:13.460487 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerName="prometheus" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.460495 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerName="prometheus" Oct 03 18:31:13 crc kubenswrapper[4835]: E1003 18:31:13.460511 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerName="config-reloader" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.460519 4835 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerName="config-reloader" Oct 03 18:31:13 crc kubenswrapper[4835]: E1003 18:31:13.460531 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerName="init-config-reloader" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.460538 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerName="init-config-reloader" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.462234 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerName="prometheus" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.462259 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerName="config-reloader" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.462273 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" containerName="thanos-sidecar" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.464848 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.464935 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.470044 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.470302 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.470432 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.470535 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.470635 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-9kc6v" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.470832 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.479808 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.511484 4835 scope.go:117] "RemoveContainer" containerID="7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.544488 4835 scope.go:117] "RemoveContainer" containerID="060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d" Oct 03 18:31:13 crc kubenswrapper[4835]: E1003 18:31:13.545052 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d\": container with ID starting with 060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d not found: ID does not exist" containerID="060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d" Oct 03 
18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.545108 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d"} err="failed to get container status \"060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d\": rpc error: code = NotFound desc = could not find container \"060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d\": container with ID starting with 060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d not found: ID does not exist" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.545135 4835 scope.go:117] "RemoveContainer" containerID="7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13" Oct 03 18:31:13 crc kubenswrapper[4835]: E1003 18:31:13.545404 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13\": container with ID starting with 7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13 not found: ID does not exist" containerID="7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.545445 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13"} err="failed to get container status \"7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13\": rpc error: code = NotFound desc = could not find container \"7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13\": container with ID starting with 7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13 not found: ID does not exist" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.545459 4835 scope.go:117] "RemoveContainer" containerID="07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a" Oct 03 18:31:13 crc kubenswrapper[4835]: E1003 18:31:13.546102 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a\": container with ID starting with 07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a not found: ID does not exist" containerID="07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.546131 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a"} err="failed to get container status \"07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a\": rpc error: code = NotFound desc = could not find container \"07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a\": container with ID starting with 07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a not found: ID does not exist" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.546150 4835 scope.go:117] "RemoveContainer" containerID="7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175" Oct 03 18:31:13 crc kubenswrapper[4835]: E1003 18:31:13.546416 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175\": container with ID starting with 
7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175 not found: ID does not exist" containerID="7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.546458 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175"} err="failed to get container status \"7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175\": rpc error: code = NotFound desc = could not find container \"7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175\": container with ID starting with 7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175 not found: ID does not exist" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.546475 4835 scope.go:117] "RemoveContainer" containerID="060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.550097 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d"} err="failed to get container status \"060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d\": rpc error: code = NotFound desc = could not find container \"060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d\": container with ID starting with 060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d not found: ID does not exist" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.550151 4835 scope.go:117] "RemoveContainer" containerID="7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.553465 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13"} err="failed to get container status \"7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13\": rpc error: code = NotFound desc = could not find container \"7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13\": container with ID starting with 7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13 not found: ID does not exist" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.553525 4835 scope.go:117] "RemoveContainer" containerID="07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.553864 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a"} err="failed to get container status \"07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a\": rpc error: code = NotFound desc = could not find container \"07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a\": container with ID starting with 07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a not found: ID does not exist" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.553889 4835 scope.go:117] "RemoveContainer" containerID="7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.555088 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175"} err="failed to get container status 
\"7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175\": rpc error: code = NotFound desc = could not find container \"7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175\": container with ID starting with 7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175 not found: ID does not exist" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.555119 4835 scope.go:117] "RemoveContainer" containerID="060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.555418 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d"} err="failed to get container status \"060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d\": rpc error: code = NotFound desc = could not find container \"060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d\": container with ID starting with 060f5ca4503f8c5c63a21d12cc767de39e482912834bb16a16c606807c44140d not found: ID does not exist" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.555522 4835 scope.go:117] "RemoveContainer" containerID="7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.556216 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13"} err="failed to get container status \"7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13\": rpc error: code = NotFound desc = could not find container \"7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13\": container with ID starting with 7b2973ecc1c61a0e4dc4eecdadb7b47a5c3ca707c74c5fe0e8b83ff108220b13 not found: ID does not exist" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.556245 4835 scope.go:117] "RemoveContainer" containerID="07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.556611 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a"} err="failed to get container status \"07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a\": rpc error: code = NotFound desc = could not find container \"07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a\": container with ID starting with 07ce5244e2501ca7952364a72d50f3cb89a5b6f975e88589c4c0082faf15a56a not found: ID does not exist" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.556641 4835 scope.go:117] "RemoveContainer" containerID="7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.556905 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175"} err="failed to get container status \"7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175\": rpc error: code = NotFound desc = could not find container \"7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175\": container with ID starting with 7115c45303b695eb21f653f3cc77c8fe160d9d369106307704548b0f7be7d175 not found: ID does not exist" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.594119 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.594458 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.594483 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-config\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.594510 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.594539 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlcm5\" (UniqueName: \"kubernetes.io/projected/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-kube-api-access-hlcm5\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.594561 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.594585 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.594629 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.594648 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.594665 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.594684 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bed34091-a921-4906-9121-f482ec67e99a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.696536 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.696617 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlcm5\" (UniqueName: \"kubernetes.io/projected/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-kube-api-access-hlcm5\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.696650 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.696685 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.696747 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.696775 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-tls-assets\") pod \"prometheus-metric-storage-0\" 
(UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.696796 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.696825 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bed34091-a921-4906-9121-f482ec67e99a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.696888 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.696924 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.696942 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-config\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.697690 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.701946 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.702661 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.703327 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability 
not set. Skipping MountDevice... Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.703358 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bed34091-a921-4906-9121-f482ec67e99a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/88bcb9a338b078b35fe2aaa6fcd1ca51c30be9164778c602eed472976adc1b23/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.706749 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-config\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.706782 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.710657 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.710677 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.710703 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.714996 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.718617 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlcm5\" (UniqueName: \"kubernetes.io/projected/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-kube-api-access-hlcm5\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.758646 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-bed34091-a921-4906-9121-f482ec67e99a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") pod \"prometheus-metric-storage-0\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:13 crc kubenswrapper[4835]: I1003 18:31:13.812580 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:14 crc kubenswrapper[4835]: I1003 18:31:14.387819 4835 generic.go:334] "Generic (PLEG): container finished" podID="a1f1068f-9964-41d4-909d-19cc6c035a73" containerID="4b3da114b9ee0fa0989b751a5259b35b3d3cb7246905ed123e47ecc35e62d80d" exitCode=0 Oct 03 18:31:14 crc kubenswrapper[4835]: I1003 18:31:14.388229 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-8132-account-create-xwj7q" event={"ID":"a1f1068f-9964-41d4-909d-19cc6c035a73","Type":"ContainerDied","Data":"4b3da114b9ee0fa0989b751a5259b35b3d3cb7246905ed123e47ecc35e62d80d"} Oct 03 18:31:14 crc kubenswrapper[4835]: I1003 18:31:14.394711 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8359224-c77a-4d86-878b-6f073225ed33","Type":"ContainerStarted","Data":"099a2da024fc58f852059d8adaf9af1bcccffdceb1dd2eb82ec18dc709136a21"} Oct 03 18:31:14 crc kubenswrapper[4835]: I1003 18:31:14.394744 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8359224-c77a-4d86-878b-6f073225ed33","Type":"ContainerStarted","Data":"507575ff185e579e3dd0cb0a2acec483239aa0550785f74ea5b3729584b77a0a"} Oct 03 18:31:14 crc kubenswrapper[4835]: I1003 18:31:14.394754 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8359224-c77a-4d86-878b-6f073225ed33","Type":"ContainerStarted","Data":"f253d0bfe03b0bbdbd88e0f318260a269996573e6ffee8b300c82ea2e957bf66"} Oct 03 18:31:14 crc kubenswrapper[4835]: I1003 18:31:14.481642 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 18:31:14 crc kubenswrapper[4835]: W1003 18:31:14.492667 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33e000bd_5fc2_4a35_97f2_5fd3c1493f1c.slice/crio-555176beb785054d0479654e50dd5ccdf87d05b9cb0171d60589123f4a69305d WatchSource:0}: Error finding container 555176beb785054d0479654e50dd5ccdf87d05b9cb0171d60589123f4a69305d: Status 404 returned error can't find the container with id 555176beb785054d0479654e50dd5ccdf87d05b9cb0171d60589123f4a69305d Oct 03 18:31:14 crc kubenswrapper[4835]: I1003 18:31:14.895375 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c6c5b2c-c368-4237-83ec-18ae3d06fe61" path="/var/lib/kubelet/pods/1c6c5b2c-c368-4237-83ec-18ae3d06fe61/volumes" Oct 03 18:31:15 crc kubenswrapper[4835]: I1003 18:31:15.407296 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8359224-c77a-4d86-878b-6f073225ed33","Type":"ContainerStarted","Data":"8bd102332f6f53b940afa39f18ed2c205008a090737391dcb8ddb4d34a7deaf4"} Oct 03 18:31:15 crc kubenswrapper[4835]: I1003 18:31:15.408227 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8359224-c77a-4d86-878b-6f073225ed33","Type":"ContainerStarted","Data":"05ed822143832ee2695135fcd8d44c423b07a0fd24b4922118d43c748c59f112"} Oct 03 18:31:15 crc kubenswrapper[4835]: I1003 
18:31:15.408307 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8359224-c77a-4d86-878b-6f073225ed33","Type":"ContainerStarted","Data":"616d7ce5fe6da85235002aec440d99c736b29e5e33572513c938816d821cdb60"} Oct 03 18:31:15 crc kubenswrapper[4835]: I1003 18:31:15.408386 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8359224-c77a-4d86-878b-6f073225ed33","Type":"ContainerStarted","Data":"7228ce68e64057d57e7d4f43de8562069bbe215519a1df7eb0ccf89de67c0daf"} Oct 03 18:31:15 crc kubenswrapper[4835]: I1003 18:31:15.408815 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c","Type":"ContainerStarted","Data":"555176beb785054d0479654e50dd5ccdf87d05b9cb0171d60589123f4a69305d"} Oct 03 18:31:15 crc kubenswrapper[4835]: I1003 18:31:15.743292 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-8132-account-create-xwj7q" Oct 03 18:31:15 crc kubenswrapper[4835]: I1003 18:31:15.830042 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thdvz\" (UniqueName: \"kubernetes.io/projected/a1f1068f-9964-41d4-909d-19cc6c035a73-kube-api-access-thdvz\") pod \"a1f1068f-9964-41d4-909d-19cc6c035a73\" (UID: \"a1f1068f-9964-41d4-909d-19cc6c035a73\") " Oct 03 18:31:15 crc kubenswrapper[4835]: I1003 18:31:15.835556 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1f1068f-9964-41d4-909d-19cc6c035a73-kube-api-access-thdvz" (OuterVolumeSpecName: "kube-api-access-thdvz") pod "a1f1068f-9964-41d4-909d-19cc6c035a73" (UID: "a1f1068f-9964-41d4-909d-19cc6c035a73"). InnerVolumeSpecName "kube-api-access-thdvz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:15 crc kubenswrapper[4835]: I1003 18:31:15.898256 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-plxn4" podUID="c9280b95-ef96-4c58-948f-2abcd7ad8a25" containerName="ovn-controller" probeResult="failure" output=< Oct 03 18:31:15 crc kubenswrapper[4835]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 03 18:31:15 crc kubenswrapper[4835]: > Oct 03 18:31:15 crc kubenswrapper[4835]: I1003 18:31:15.932382 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thdvz\" (UniqueName: \"kubernetes.io/projected/a1f1068f-9964-41d4-909d-19cc6c035a73-kube-api-access-thdvz\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.423422 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8359224-c77a-4d86-878b-6f073225ed33","Type":"ContainerStarted","Data":"854d7f082ae3b3b2f4af3ecd3a64d0a83b5e06115bb5a82be27b04b2050d0728"} Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.424673 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-8132-account-create-xwj7q" event={"ID":"a1f1068f-9964-41d4-909d-19cc6c035a73","Type":"ContainerDied","Data":"9f72837d927e1595203ee1fd4951a8f53b69f9e84f6f972feb93ce78f46d08ff"} Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.424702 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f72837d927e1595203ee1fd4951a8f53b69f9e84f6f972feb93ce78f46d08ff" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.424708 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-8132-account-create-xwj7q" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.453883 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.487918811 podStartE2EDuration="24.453862705s" podCreationTimestamp="2025-10-03 18:30:52 +0000 UTC" firstStartedPulling="2025-10-03 18:31:10.091971882 +0000 UTC m=+1011.807912754" lastFinishedPulling="2025-10-03 18:31:14.057915776 +0000 UTC m=+1015.773856648" observedRunningTime="2025-10-03 18:31:16.451112677 +0000 UTC m=+1018.167053579" watchObservedRunningTime="2025-10-03 18:31:16.453862705 +0000 UTC m=+1018.169803587" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.703449 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c66c75b9c-6v2j7"] Oct 03 18:31:16 crc kubenswrapper[4835]: E1003 18:31:16.703782 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1f1068f-9964-41d4-909d-19cc6c035a73" containerName="mariadb-account-create" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.703800 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1f1068f-9964-41d4-909d-19cc6c035a73" containerName="mariadb-account-create" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.703949 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1f1068f-9964-41d4-909d-19cc6c035a73" containerName="mariadb-account-create" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.704845 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.707473 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.717573 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c66c75b9c-6v2j7"] Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.743548 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-ovsdbserver-sb\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.743634 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-dns-swift-storage-0\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.743882 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jk6k\" (UniqueName: \"kubernetes.io/projected/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-kube-api-access-5jk6k\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.743950 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-dns-svc\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.744077 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-config\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.744135 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-ovsdbserver-nb\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.845638 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-config\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.845696 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-ovsdbserver-nb\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: 
\"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.845746 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-ovsdbserver-sb\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.845798 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-dns-swift-storage-0\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.845843 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jk6k\" (UniqueName: \"kubernetes.io/projected/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-kube-api-access-5jk6k\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.845892 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-dns-svc\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.846533 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-config\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.846552 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-dns-svc\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.847154 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-ovsdbserver-nb\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.847229 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-ovsdbserver-sb\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.847342 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-dns-swift-storage-0\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 
18:31:16 crc kubenswrapper[4835]: I1003 18:31:16.921602 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jk6k\" (UniqueName: \"kubernetes.io/projected/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-kube-api-access-5jk6k\") pod \"dnsmasq-dns-5c66c75b9c-6v2j7\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:17 crc kubenswrapper[4835]: I1003 18:31:17.039460 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:17 crc kubenswrapper[4835]: I1003 18:31:17.432301 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c","Type":"ContainerStarted","Data":"93b2837e510576d8bb572b89f16db0212e746865c19802cde66953cd8f7661d8"} Oct 03 18:31:17 crc kubenswrapper[4835]: I1003 18:31:17.468524 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c66c75b9c-6v2j7"] Oct 03 18:31:17 crc kubenswrapper[4835]: W1003 18:31:17.476238 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e4ebb1c_34bd_44d8_a7c8_a69cb4856406.slice/crio-4528d4c1f22a5591e42782b7b1d3c97c83a9364fffb62d55ee5ca95a5ccbbff7 WatchSource:0}: Error finding container 4528d4c1f22a5591e42782b7b1d3c97c83a9364fffb62d55ee5ca95a5ccbbff7: Status 404 returned error can't find the container with id 4528d4c1f22a5591e42782b7b1d3c97c83a9364fffb62d55ee5ca95a5ccbbff7 Oct 03 18:31:18 crc kubenswrapper[4835]: I1003 18:31:18.441537 4835 generic.go:334] "Generic (PLEG): container finished" podID="9e4ebb1c-34bd-44d8-a7c8-a69cb4856406" containerID="09637d74d03dd5d26157f5fb908c8c2ddcdc1d2babcc7bbb74a92863ac5f3901" exitCode=0 Oct 03 18:31:18 crc kubenswrapper[4835]: I1003 18:31:18.441587 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" event={"ID":"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406","Type":"ContainerDied","Data":"09637d74d03dd5d26157f5fb908c8c2ddcdc1d2babcc7bbb74a92863ac5f3901"} Oct 03 18:31:18 crc kubenswrapper[4835]: I1003 18:31:18.441863 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" event={"ID":"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406","Type":"ContainerStarted","Data":"4528d4c1f22a5591e42782b7b1d3c97c83a9364fffb62d55ee5ca95a5ccbbff7"} Oct 03 18:31:19 crc kubenswrapper[4835]: I1003 18:31:19.450846 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" event={"ID":"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406","Type":"ContainerStarted","Data":"7c61bc02b23bcd9fc6866001caa8e97586079c3bb9b456496ea093cc4186e4c0"} Oct 03 18:31:19 crc kubenswrapper[4835]: I1003 18:31:19.451020 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:19 crc kubenswrapper[4835]: I1003 18:31:19.468258 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" podStartSLOduration=3.468242586 podStartE2EDuration="3.468242586s" podCreationTimestamp="2025-10-03 18:31:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:31:19.466110063 +0000 UTC m=+1021.182050935" watchObservedRunningTime="2025-10-03 18:31:19.468242586 +0000 UTC m=+1021.184183458" Oct 03 18:31:20 
crc kubenswrapper[4835]: I1003 18:31:20.489437 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-41e5-account-create-zlkgq"] Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.490821 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-41e5-account-create-zlkgq" Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.493849 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.512910 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-41e5-account-create-zlkgq"] Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.602532 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4rjs\" (UniqueName: \"kubernetes.io/projected/5816f3db-fcba-4c2f-872e-0bdf53f7e7df-kube-api-access-r4rjs\") pod \"keystone-41e5-account-create-zlkgq\" (UID: \"5816f3db-fcba-4c2f-872e-0bdf53f7e7df\") " pod="openstack/keystone-41e5-account-create-zlkgq" Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.688410 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-dd46-account-create-j255n"] Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.689530 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-dd46-account-create-j255n" Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.694458 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-dd46-account-create-j255n"] Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.736279 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kh42\" (UniqueName: \"kubernetes.io/projected/9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b-kube-api-access-6kh42\") pod \"placement-dd46-account-create-j255n\" (UID: \"9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b\") " pod="openstack/placement-dd46-account-create-j255n" Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.736377 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4rjs\" (UniqueName: \"kubernetes.io/projected/5816f3db-fcba-4c2f-872e-0bdf53f7e7df-kube-api-access-r4rjs\") pod \"keystone-41e5-account-create-zlkgq\" (UID: \"5816f3db-fcba-4c2f-872e-0bdf53f7e7df\") " pod="openstack/keystone-41e5-account-create-zlkgq" Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.738370 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.758259 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4rjs\" (UniqueName: \"kubernetes.io/projected/5816f3db-fcba-4c2f-872e-0bdf53f7e7df-kube-api-access-r4rjs\") pod \"keystone-41e5-account-create-zlkgq\" (UID: \"5816f3db-fcba-4c2f-872e-0bdf53f7e7df\") " pod="openstack/keystone-41e5-account-create-zlkgq" Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.813902 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-41e5-account-create-zlkgq" Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.838015 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kh42\" (UniqueName: \"kubernetes.io/projected/9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b-kube-api-access-6kh42\") pod \"placement-dd46-account-create-j255n\" (UID: \"9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b\") " pod="openstack/placement-dd46-account-create-j255n" Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.855367 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kh42\" (UniqueName: \"kubernetes.io/projected/9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b-kube-api-access-6kh42\") pod \"placement-dd46-account-create-j255n\" (UID: \"9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b\") " pod="openstack/placement-dd46-account-create-j255n" Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.905981 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-plxn4" podUID="c9280b95-ef96-4c58-948f-2abcd7ad8a25" containerName="ovn-controller" probeResult="failure" output=< Oct 03 18:31:20 crc kubenswrapper[4835]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 03 18:31:20 crc kubenswrapper[4835]: > Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.917847 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-dd46-account-create-j255n" Oct 03 18:31:20 crc kubenswrapper[4835]: I1003 18:31:20.956184 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.000654 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gkx58" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.011894 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c92d-account-create-2tkrd"] Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.013176 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c92d-account-create-2tkrd" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.017446 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.023201 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c92d-account-create-2tkrd"] Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.145159 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zwgt\" (UniqueName: \"kubernetes.io/projected/31260cd4-00fe-424d-9111-32de1fcae207-kube-api-access-6zwgt\") pod \"glance-c92d-account-create-2tkrd\" (UID: \"31260cd4-00fe-424d-9111-32de1fcae207\") " pod="openstack/glance-c92d-account-create-2tkrd" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.214851 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-plxn4-config-vlk78"] Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.215971 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.218590 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.228726 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-plxn4-config-vlk78"] Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.247455 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zwgt\" (UniqueName: \"kubernetes.io/projected/31260cd4-00fe-424d-9111-32de1fcae207-kube-api-access-6zwgt\") pod \"glance-c92d-account-create-2tkrd\" (UID: \"31260cd4-00fe-424d-9111-32de1fcae207\") " pod="openstack/glance-c92d-account-create-2tkrd" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.270496 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zwgt\" (UniqueName: \"kubernetes.io/projected/31260cd4-00fe-424d-9111-32de1fcae207-kube-api-access-6zwgt\") pod \"glance-c92d-account-create-2tkrd\" (UID: \"31260cd4-00fe-424d-9111-32de1fcae207\") " pod="openstack/glance-c92d-account-create-2tkrd" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.281665 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-41e5-account-create-zlkgq"] Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.347642 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c92d-account-create-2tkrd" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.348909 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-log-ovn\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.349370 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-scripts\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.349485 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-additional-scripts\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.349630 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx8sj\" (UniqueName: \"kubernetes.io/projected/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-kube-api-access-wx8sj\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.349674 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-run\") pod 
\"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.349746 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-run-ovn\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: W1003 18:31:21.373493 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dfe15e4_e9d4_45cb_a7ce_9377e0666f7b.slice/crio-18b487200a46a0375fe7dbd97c41d4d54c283a206cd7f26c564b21af3933d71f WatchSource:0}: Error finding container 18b487200a46a0375fe7dbd97c41d4d54c283a206cd7f26c564b21af3933d71f: Status 404 returned error can't find the container with id 18b487200a46a0375fe7dbd97c41d4d54c283a206cd7f26c564b21af3933d71f Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.374808 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-dd46-account-create-j255n"] Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.451265 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx8sj\" (UniqueName: \"kubernetes.io/projected/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-kube-api-access-wx8sj\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.451571 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-run\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.451657 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-run-ovn\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.451692 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-log-ovn\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.451731 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-scripts\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.451774 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-additional-scripts\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: 
\"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.451969 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-log-ovn\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.451988 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-run-ovn\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.452843 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-additional-scripts\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.453958 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-scripts\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.454090 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-run\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.470787 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx8sj\" (UniqueName: \"kubernetes.io/projected/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-kube-api-access-wx8sj\") pod \"ovn-controller-plxn4-config-vlk78\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.497893 4835 generic.go:334] "Generic (PLEG): container finished" podID="6fd26bdb-868b-49db-9698-e7c79eea5cef" containerID="82cdd552bd8994820289189a6ad58b7da0d0849b9d8048609dea0a6514c0530d" exitCode=0 Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.497975 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6fd26bdb-868b-49db-9698-e7c79eea5cef","Type":"ContainerDied","Data":"82cdd552bd8994820289189a6ad58b7da0d0849b9d8048609dea0a6514c0530d"} Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.530907 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.537104 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-41e5-account-create-zlkgq" event={"ID":"5816f3db-fcba-4c2f-872e-0bdf53f7e7df","Type":"ContainerStarted","Data":"5ada613162cff0cebd48cff9da0bb102dfd8a8bc1ef58c7d504870a3bc5e235f"} Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.537155 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-41e5-account-create-zlkgq" event={"ID":"5816f3db-fcba-4c2f-872e-0bdf53f7e7df","Type":"ContainerStarted","Data":"3d25b7496f5042748dd0953617c09da7711c8ea7538ba998082fe6fcdcb691fd"} Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.540744 4835 generic.go:334] "Generic (PLEG): container finished" podID="b17ce629-9abd-42ba-8004-cc4b85cee405" containerID="5a870ffee6244949225426c57f6551b5424c7b1591920c9a5feadf9e78b8050c" exitCode=0 Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.540789 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"b17ce629-9abd-42ba-8004-cc4b85cee405","Type":"ContainerDied","Data":"5a870ffee6244949225426c57f6551b5424c7b1591920c9a5feadf9e78b8050c"} Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.548442 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-dd46-account-create-j255n" event={"ID":"9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b","Type":"ContainerStarted","Data":"18b487200a46a0375fe7dbd97c41d4d54c283a206cd7f26c564b21af3933d71f"} Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.557450 4835 generic.go:334] "Generic (PLEG): container finished" podID="2f5f99aa-dba6-465b-866a-1e293ba51685" containerID="6073b92c5fb7502902c8d7c4767ef2d79902b9a0b6bebdbfdacd3f52f2b31641" exitCode=0 Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.557537 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2f5f99aa-dba6-465b-866a-1e293ba51685","Type":"ContainerDied","Data":"6073b92c5fb7502902c8d7c4767ef2d79902b9a0b6bebdbfdacd3f52f2b31641"} Oct 03 18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.796940 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c92d-account-create-2tkrd"] Oct 03 18:31:21 crc kubenswrapper[4835]: W1003 18:31:21.809296 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31260cd4_00fe_424d_9111_32de1fcae207.slice/crio-3bfbcfe397d46d7314e034c57a50577aad1a7bbc7a2db4c1c23bd7cc4dc64125 WatchSource:0}: Error finding container 3bfbcfe397d46d7314e034c57a50577aad1a7bbc7a2db4c1c23bd7cc4dc64125: Status 404 returned error can't find the container with id 3bfbcfe397d46d7314e034c57a50577aad1a7bbc7a2db4c1c23bd7cc4dc64125 Oct 03 18:31:21 crc kubenswrapper[4835]: E1003 18:31:21.831001 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dfe15e4_e9d4_45cb_a7ce_9377e0666f7b.slice/crio-490642377d84e04f9b33a361c5af3b4470568cbcb85ec0f487211ece75e70252.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dfe15e4_e9d4_45cb_a7ce_9377e0666f7b.slice/crio-conmon-490642377d84e04f9b33a361c5af3b4470568cbcb85ec0f487211ece75e70252.scope\": RecentStats: unable to find data in memory cache]" Oct 03 
18:31:21 crc kubenswrapper[4835]: I1003 18:31:21.992478 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-plxn4-config-vlk78"] Oct 03 18:31:21 crc kubenswrapper[4835]: W1003 18:31:21.992618 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30caa8b9_b964_4ea7_b39e_2e61a7e9935b.slice/crio-dd7f46912c2ec538c6f2eb502a0d4f411e40922886e63a530e8d8d42893efdcd WatchSource:0}: Error finding container dd7f46912c2ec538c6f2eb502a0d4f411e40922886e63a530e8d8d42893efdcd: Status 404 returned error can't find the container with id dd7f46912c2ec538c6f2eb502a0d4f411e40922886e63a530e8d8d42893efdcd Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.567648 4835 generic.go:334] "Generic (PLEG): container finished" podID="9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b" containerID="490642377d84e04f9b33a361c5af3b4470568cbcb85ec0f487211ece75e70252" exitCode=0 Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.568028 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-dd46-account-create-j255n" event={"ID":"9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b","Type":"ContainerDied","Data":"490642377d84e04f9b33a361c5af3b4470568cbcb85ec0f487211ece75e70252"} Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.570882 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2f5f99aa-dba6-465b-866a-1e293ba51685","Type":"ContainerStarted","Data":"daa7a9cd848675475ad1e8821bcf022f7f368dccbd8b463091be4977cebba766"} Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.571373 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.572746 4835 generic.go:334] "Generic (PLEG): container finished" podID="31260cd4-00fe-424d-9111-32de1fcae207" containerID="eda660eb2c65a655847c41b971837f189a5f1c84018b25b336b7f65aebd0334c" exitCode=0 Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.572810 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c92d-account-create-2tkrd" event={"ID":"31260cd4-00fe-424d-9111-32de1fcae207","Type":"ContainerDied","Data":"eda660eb2c65a655847c41b971837f189a5f1c84018b25b336b7f65aebd0334c"} Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.573006 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c92d-account-create-2tkrd" event={"ID":"31260cd4-00fe-424d-9111-32de1fcae207","Type":"ContainerStarted","Data":"3bfbcfe397d46d7314e034c57a50577aad1a7bbc7a2db4c1c23bd7cc4dc64125"} Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.574326 4835 generic.go:334] "Generic (PLEG): container finished" podID="30caa8b9-b964-4ea7-b39e-2e61a7e9935b" containerID="dab9ce77bce4472a506459f98e1685e2f0ed108cb68a34365d95023f2d404c24" exitCode=0 Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.574370 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-plxn4-config-vlk78" event={"ID":"30caa8b9-b964-4ea7-b39e-2e61a7e9935b","Type":"ContainerDied","Data":"dab9ce77bce4472a506459f98e1685e2f0ed108cb68a34365d95023f2d404c24"} Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.574384 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-plxn4-config-vlk78" event={"ID":"30caa8b9-b964-4ea7-b39e-2e61a7e9935b","Type":"ContainerStarted","Data":"dd7f46912c2ec538c6f2eb502a0d4f411e40922886e63a530e8d8d42893efdcd"} Oct 03 
18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.576437 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6fd26bdb-868b-49db-9698-e7c79eea5cef","Type":"ContainerStarted","Data":"45412f9b1cc503ca1d65287ceb8ce89e2ec70b786e4f3f800c94087f759517e0"} Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.576854 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.577998 4835 generic.go:334] "Generic (PLEG): container finished" podID="5816f3db-fcba-4c2f-872e-0bdf53f7e7df" containerID="5ada613162cff0cebd48cff9da0bb102dfd8a8bc1ef58c7d504870a3bc5e235f" exitCode=0 Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.578041 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-41e5-account-create-zlkgq" event={"ID":"5816f3db-fcba-4c2f-872e-0bdf53f7e7df","Type":"ContainerDied","Data":"5ada613162cff0cebd48cff9da0bb102dfd8a8bc1ef58c7d504870a3bc5e235f"} Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.580445 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"b17ce629-9abd-42ba-8004-cc4b85cee405","Type":"ContainerStarted","Data":"3185b890a307373e84de1011ceccacdf28e28ed1f1ba5630fb0e8372c332e556"} Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.580821 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.626370 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=57.203839654 podStartE2EDuration="1m7.62634132s" podCreationTimestamp="2025-10-03 18:30:15 +0000 UTC" firstStartedPulling="2025-10-03 18:30:28.944243128 +0000 UTC m=+970.660184000" lastFinishedPulling="2025-10-03 18:30:39.366744794 +0000 UTC m=+981.082685666" observedRunningTime="2025-10-03 18:31:22.620582058 +0000 UTC m=+1024.336522920" watchObservedRunningTime="2025-10-03 18:31:22.62634132 +0000 UTC m=+1024.342282202" Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.644042 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=57.134605114 podStartE2EDuration="1m7.644024795s" podCreationTimestamp="2025-10-03 18:30:15 +0000 UTC" firstStartedPulling="2025-10-03 18:30:29.481849595 +0000 UTC m=+971.197790467" lastFinishedPulling="2025-10-03 18:30:39.991269276 +0000 UTC m=+981.707210148" observedRunningTime="2025-10-03 18:31:22.643177013 +0000 UTC m=+1024.359117885" watchObservedRunningTime="2025-10-03 18:31:22.644024795 +0000 UTC m=+1024.359965657" Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.704736 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=56.879977214 podStartE2EDuration="1m7.704718606s" podCreationTimestamp="2025-10-03 18:30:15 +0000 UTC" firstStartedPulling="2025-10-03 18:30:29.691593471 +0000 UTC m=+971.407534343" lastFinishedPulling="2025-10-03 18:30:40.516334853 +0000 UTC m=+982.232275735" observedRunningTime="2025-10-03 18:31:22.703440755 +0000 UTC m=+1024.419381647" watchObservedRunningTime="2025-10-03 18:31:22.704718606 +0000 UTC m=+1024.420659478" Oct 03 18:31:22 crc kubenswrapper[4835]: I1003 18:31:22.946567 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-41e5-account-create-zlkgq" Oct 03 18:31:23 crc kubenswrapper[4835]: I1003 18:31:23.084468 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4rjs\" (UniqueName: \"kubernetes.io/projected/5816f3db-fcba-4c2f-872e-0bdf53f7e7df-kube-api-access-r4rjs\") pod \"5816f3db-fcba-4c2f-872e-0bdf53f7e7df\" (UID: \"5816f3db-fcba-4c2f-872e-0bdf53f7e7df\") " Oct 03 18:31:23 crc kubenswrapper[4835]: I1003 18:31:23.089454 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5816f3db-fcba-4c2f-872e-0bdf53f7e7df-kube-api-access-r4rjs" (OuterVolumeSpecName: "kube-api-access-r4rjs") pod "5816f3db-fcba-4c2f-872e-0bdf53f7e7df" (UID: "5816f3db-fcba-4c2f-872e-0bdf53f7e7df"). InnerVolumeSpecName "kube-api-access-r4rjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:23 crc kubenswrapper[4835]: I1003 18:31:23.185982 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4rjs\" (UniqueName: \"kubernetes.io/projected/5816f3db-fcba-4c2f-872e-0bdf53f7e7df-kube-api-access-r4rjs\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:23 crc kubenswrapper[4835]: I1003 18:31:23.591398 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-41e5-account-create-zlkgq" event={"ID":"5816f3db-fcba-4c2f-872e-0bdf53f7e7df","Type":"ContainerDied","Data":"3d25b7496f5042748dd0953617c09da7711c8ea7538ba998082fe6fcdcb691fd"} Oct 03 18:31:23 crc kubenswrapper[4835]: I1003 18:31:23.591444 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d25b7496f5042748dd0953617c09da7711c8ea7538ba998082fe6fcdcb691fd" Oct 03 18:31:23 crc kubenswrapper[4835]: I1003 18:31:23.591589 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-41e5-account-create-zlkgq" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.086530 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-dd46-account-create-j255n" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.208588 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kh42\" (UniqueName: \"kubernetes.io/projected/9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b-kube-api-access-6kh42\") pod \"9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b\" (UID: \"9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b\") " Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.214349 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b-kube-api-access-6kh42" (OuterVolumeSpecName: "kube-api-access-6kh42") pod "9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b" (UID: "9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b"). InnerVolumeSpecName "kube-api-access-6kh42". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.221985 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.227534 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c92d-account-create-2tkrd" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.310461 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-log-ovn\") pod \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.310589 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zwgt\" (UniqueName: \"kubernetes.io/projected/31260cd4-00fe-424d-9111-32de1fcae207-kube-api-access-6zwgt\") pod \"31260cd4-00fe-424d-9111-32de1fcae207\" (UID: \"31260cd4-00fe-424d-9111-32de1fcae207\") " Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.310597 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "30caa8b9-b964-4ea7-b39e-2e61a7e9935b" (UID: "30caa8b9-b964-4ea7-b39e-2e61a7e9935b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.310636 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-run\") pod \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.310674 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-scripts\") pod \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.310692 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-run-ovn\") pod \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.310696 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-run" (OuterVolumeSpecName: "var-run") pod "30caa8b9-b964-4ea7-b39e-2e61a7e9935b" (UID: "30caa8b9-b964-4ea7-b39e-2e61a7e9935b"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.310749 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-additional-scripts\") pod \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.310803 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx8sj\" (UniqueName: \"kubernetes.io/projected/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-kube-api-access-wx8sj\") pod \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\" (UID: \"30caa8b9-b964-4ea7-b39e-2e61a7e9935b\") " Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.310818 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "30caa8b9-b964-4ea7-b39e-2e61a7e9935b" (UID: "30caa8b9-b964-4ea7-b39e-2e61a7e9935b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.311271 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kh42\" (UniqueName: \"kubernetes.io/projected/9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b-kube-api-access-6kh42\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.311297 4835 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.311310 4835 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.311322 4835 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.311617 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "30caa8b9-b964-4ea7-b39e-2e61a7e9935b" (UID: "30caa8b9-b964-4ea7-b39e-2e61a7e9935b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.311865 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-scripts" (OuterVolumeSpecName: "scripts") pod "30caa8b9-b964-4ea7-b39e-2e61a7e9935b" (UID: "30caa8b9-b964-4ea7-b39e-2e61a7e9935b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.314434 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-kube-api-access-wx8sj" (OuterVolumeSpecName: "kube-api-access-wx8sj") pod "30caa8b9-b964-4ea7-b39e-2e61a7e9935b" (UID: "30caa8b9-b964-4ea7-b39e-2e61a7e9935b"). 
InnerVolumeSpecName "kube-api-access-wx8sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.315275 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31260cd4-00fe-424d-9111-32de1fcae207-kube-api-access-6zwgt" (OuterVolumeSpecName: "kube-api-access-6zwgt") pod "31260cd4-00fe-424d-9111-32de1fcae207" (UID: "31260cd4-00fe-424d-9111-32de1fcae207"). InnerVolumeSpecName "kube-api-access-6zwgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.413090 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zwgt\" (UniqueName: \"kubernetes.io/projected/31260cd4-00fe-424d-9111-32de1fcae207-kube-api-access-6zwgt\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.413130 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.413140 4835 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.413150 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx8sj\" (UniqueName: \"kubernetes.io/projected/30caa8b9-b964-4ea7-b39e-2e61a7e9935b-kube-api-access-wx8sj\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.600965 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c92d-account-create-2tkrd" event={"ID":"31260cd4-00fe-424d-9111-32de1fcae207","Type":"ContainerDied","Data":"3bfbcfe397d46d7314e034c57a50577aad1a7bbc7a2db4c1c23bd7cc4dc64125"} Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.601002 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bfbcfe397d46d7314e034c57a50577aad1a7bbc7a2db4c1c23bd7cc4dc64125" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.600978 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c92d-account-create-2tkrd" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.602283 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-plxn4-config-vlk78" event={"ID":"30caa8b9-b964-4ea7-b39e-2e61a7e9935b","Type":"ContainerDied","Data":"dd7f46912c2ec538c6f2eb502a0d4f411e40922886e63a530e8d8d42893efdcd"} Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.602311 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd7f46912c2ec538c6f2eb502a0d4f411e40922886e63a530e8d8d42893efdcd" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.602359 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-plxn4-config-vlk78" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.607827 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-dd46-account-create-j255n" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.607826 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-dd46-account-create-j255n" event={"ID":"9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b","Type":"ContainerDied","Data":"18b487200a46a0375fe7dbd97c41d4d54c283a206cd7f26c564b21af3933d71f"} Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.607967 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18b487200a46a0375fe7dbd97c41d4d54c283a206cd7f26c564b21af3933d71f" Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.609093 4835 generic.go:334] "Generic (PLEG): container finished" podID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerID="93b2837e510576d8bb572b89f16db0212e746865c19802cde66953cd8f7661d8" exitCode=0 Oct 03 18:31:24 crc kubenswrapper[4835]: I1003 18:31:24.609127 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c","Type":"ContainerDied","Data":"93b2837e510576d8bb572b89f16db0212e746865c19802cde66953cd8f7661d8"} Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.313109 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-plxn4-config-vlk78"] Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.322706 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-plxn4-config-vlk78"] Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.415227 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-plxn4-config-pgfpp"] Oct 03 18:31:25 crc kubenswrapper[4835]: E1003 18:31:25.415874 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30caa8b9-b964-4ea7-b39e-2e61a7e9935b" containerName="ovn-config" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.415972 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="30caa8b9-b964-4ea7-b39e-2e61a7e9935b" containerName="ovn-config" Oct 03 18:31:25 crc kubenswrapper[4835]: E1003 18:31:25.416122 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5816f3db-fcba-4c2f-872e-0bdf53f7e7df" containerName="mariadb-account-create" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.416203 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5816f3db-fcba-4c2f-872e-0bdf53f7e7df" containerName="mariadb-account-create" Oct 03 18:31:25 crc kubenswrapper[4835]: E1003 18:31:25.416278 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b" containerName="mariadb-account-create" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.416358 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b" containerName="mariadb-account-create" Oct 03 18:31:25 crc kubenswrapper[4835]: E1003 18:31:25.416450 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31260cd4-00fe-424d-9111-32de1fcae207" containerName="mariadb-account-create" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.416530 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="31260cd4-00fe-424d-9111-32de1fcae207" containerName="mariadb-account-create" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.416803 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5816f3db-fcba-4c2f-872e-0bdf53f7e7df" containerName="mariadb-account-create" Oct 03 18:31:25 crc 
kubenswrapper[4835]: I1003 18:31:25.416903 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="31260cd4-00fe-424d-9111-32de1fcae207" containerName="mariadb-account-create" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.416984 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="30caa8b9-b964-4ea7-b39e-2e61a7e9935b" containerName="ovn-config" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.417084 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b" containerName="mariadb-account-create" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.417886 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.425125 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-plxn4-config-pgfpp"] Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.428248 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-log-ovn\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.428287 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-run-ovn\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.428318 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b70ca8c6-6107-4063-b114-5d2f2b249c23-additional-scripts\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.428288 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.428386 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b70ca8c6-6107-4063-b114-5d2f2b249c23-scripts\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.428418 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-run\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.428469 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j85qp\" (UniqueName: \"kubernetes.io/projected/b70ca8c6-6107-4063-b114-5d2f2b249c23-kube-api-access-j85qp\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: 
\"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.530525 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b70ca8c6-6107-4063-b114-5d2f2b249c23-additional-scripts\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.530629 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b70ca8c6-6107-4063-b114-5d2f2b249c23-scripts\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.530662 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-run\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.530713 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j85qp\" (UniqueName: \"kubernetes.io/projected/b70ca8c6-6107-4063-b114-5d2f2b249c23-kube-api-access-j85qp\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.530761 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-log-ovn\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.530781 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-run-ovn\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.531041 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-run-ovn\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.531134 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-log-ovn\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.531177 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-run\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: 
\"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.531492 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b70ca8c6-6107-4063-b114-5d2f2b249c23-additional-scripts\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.532788 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b70ca8c6-6107-4063-b114-5d2f2b249c23-scripts\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.549568 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j85qp\" (UniqueName: \"kubernetes.io/projected/b70ca8c6-6107-4063-b114-5d2f2b249c23-kube-api-access-j85qp\") pod \"ovn-controller-plxn4-config-pgfpp\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.632655 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c","Type":"ContainerStarted","Data":"08b6fbb1fe627ec9a53ec7af83710b0cc79ef5a3e7770f68b29545574299d4c2"} Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.733606 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:25 crc kubenswrapper[4835]: I1003 18:31:25.917094 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-plxn4" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.193550 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-plxn4-config-pgfpp"] Oct 03 18:31:26 crc kubenswrapper[4835]: W1003 18:31:26.195468 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb70ca8c6_6107_4063_b114_5d2f2b249c23.slice/crio-81384af941e313ca2769a42987079d9ad5e1182364c36a00d844d38c3632b02f WatchSource:0}: Error finding container 81384af941e313ca2769a42987079d9ad5e1182364c36a00d844d38c3632b02f: Status 404 returned error can't find the container with id 81384af941e313ca2769a42987079d9ad5e1182364c36a00d844d38c3632b02f Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.243204 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-9jg6f"] Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.245097 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9jg6f" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.250331 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-z6gzn" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.250579 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.259446 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9jg6f"] Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.349384 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-db-sync-config-data\") pod \"glance-db-sync-9jg6f\" (UID: \"94622de9-5048-41be-875b-dc37acc7eba4\") " pod="openstack/glance-db-sync-9jg6f" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.349456 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-config-data\") pod \"glance-db-sync-9jg6f\" (UID: \"94622de9-5048-41be-875b-dc37acc7eba4\") " pod="openstack/glance-db-sync-9jg6f" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.349486 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtrrk\" (UniqueName: \"kubernetes.io/projected/94622de9-5048-41be-875b-dc37acc7eba4-kube-api-access-mtrrk\") pod \"glance-db-sync-9jg6f\" (UID: \"94622de9-5048-41be-875b-dc37acc7eba4\") " pod="openstack/glance-db-sync-9jg6f" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.349600 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-combined-ca-bundle\") pod \"glance-db-sync-9jg6f\" (UID: \"94622de9-5048-41be-875b-dc37acc7eba4\") " pod="openstack/glance-db-sync-9jg6f" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.451156 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-config-data\") pod \"glance-db-sync-9jg6f\" (UID: \"94622de9-5048-41be-875b-dc37acc7eba4\") " pod="openstack/glance-db-sync-9jg6f" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.451225 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtrrk\" (UniqueName: \"kubernetes.io/projected/94622de9-5048-41be-875b-dc37acc7eba4-kube-api-access-mtrrk\") pod \"glance-db-sync-9jg6f\" (UID: \"94622de9-5048-41be-875b-dc37acc7eba4\") " pod="openstack/glance-db-sync-9jg6f" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.451261 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-combined-ca-bundle\") pod \"glance-db-sync-9jg6f\" (UID: \"94622de9-5048-41be-875b-dc37acc7eba4\") " pod="openstack/glance-db-sync-9jg6f" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.451416 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-db-sync-config-data\") pod 
\"glance-db-sync-9jg6f\" (UID: \"94622de9-5048-41be-875b-dc37acc7eba4\") " pod="openstack/glance-db-sync-9jg6f" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.520910 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-combined-ca-bundle\") pod \"glance-db-sync-9jg6f\" (UID: \"94622de9-5048-41be-875b-dc37acc7eba4\") " pod="openstack/glance-db-sync-9jg6f" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.524788 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-config-data\") pod \"glance-db-sync-9jg6f\" (UID: \"94622de9-5048-41be-875b-dc37acc7eba4\") " pod="openstack/glance-db-sync-9jg6f" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.526801 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-db-sync-config-data\") pod \"glance-db-sync-9jg6f\" (UID: \"94622de9-5048-41be-875b-dc37acc7eba4\") " pod="openstack/glance-db-sync-9jg6f" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.531567 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtrrk\" (UniqueName: \"kubernetes.io/projected/94622de9-5048-41be-875b-dc37acc7eba4-kube-api-access-mtrrk\") pod \"glance-db-sync-9jg6f\" (UID: \"94622de9-5048-41be-875b-dc37acc7eba4\") " pod="openstack/glance-db-sync-9jg6f" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.606643 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9jg6f" Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.650225 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-plxn4-config-pgfpp" event={"ID":"b70ca8c6-6107-4063-b114-5d2f2b249c23","Type":"ContainerStarted","Data":"81384af941e313ca2769a42987079d9ad5e1182364c36a00d844d38c3632b02f"} Oct 03 18:31:26 crc kubenswrapper[4835]: I1003 18:31:26.895559 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30caa8b9-b964-4ea7-b39e-2e61a7e9935b" path="/var/lib/kubelet/pods/30caa8b9-b964-4ea7-b39e-2e61a7e9935b/volumes" Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.042347 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.111645 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4d484c5-nhr9s"] Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.111881 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" podUID="dbd81335-256c-4b39-bb05-096ba524b652" containerName="dnsmasq-dns" containerID="cri-o://c03fbcfbbc7d49b7a970051bce876846f100e18c6ef3d83a8388563ff2e963af" gracePeriod=10 Oct 03 18:31:27 crc kubenswrapper[4835]: W1003 18:31:27.228492 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94622de9_5048_41be_875b_dc37acc7eba4.slice/crio-97fa768958ebc3c8836b5b5440f17ff91dd92ef1d5d93c94f91358f301a3a0be WatchSource:0}: Error finding container 97fa768958ebc3c8836b5b5440f17ff91dd92ef1d5d93c94f91358f301a3a0be: Status 404 returned error can't find the container with id 
97fa768958ebc3c8836b5b5440f17ff91dd92ef1d5d93c94f91358f301a3a0be Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.240799 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9jg6f"] Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.665554 4835 generic.go:334] "Generic (PLEG): container finished" podID="b70ca8c6-6107-4063-b114-5d2f2b249c23" containerID="e419d1b2ee6873736a130d7289602eb9547f62118a25f036775816834ffc3d68" exitCode=0 Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.665841 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-plxn4-config-pgfpp" event={"ID":"b70ca8c6-6107-4063-b114-5d2f2b249c23","Type":"ContainerDied","Data":"e419d1b2ee6873736a130d7289602eb9547f62118a25f036775816834ffc3d68"} Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.668138 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9jg6f" event={"ID":"94622de9-5048-41be-875b-dc37acc7eba4","Type":"ContainerStarted","Data":"97fa768958ebc3c8836b5b5440f17ff91dd92ef1d5d93c94f91358f301a3a0be"} Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.670035 4835 generic.go:334] "Generic (PLEG): container finished" podID="dbd81335-256c-4b39-bb05-096ba524b652" containerID="c03fbcfbbc7d49b7a970051bce876846f100e18c6ef3d83a8388563ff2e963af" exitCode=0 Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.670100 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" event={"ID":"dbd81335-256c-4b39-bb05-096ba524b652","Type":"ContainerDied","Data":"c03fbcfbbc7d49b7a970051bce876846f100e18c6ef3d83a8388563ff2e963af"} Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.673150 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c","Type":"ContainerStarted","Data":"a71bcb22bebacf36d00c0d1c06e34c37779006bb91d8db2e7161bf68a5112732"} Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.770376 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.872810 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-config\") pod \"dbd81335-256c-4b39-bb05-096ba524b652\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.872879 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-dns-svc\") pod \"dbd81335-256c-4b39-bb05-096ba524b652\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.872939 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-ovsdbserver-nb\") pod \"dbd81335-256c-4b39-bb05-096ba524b652\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.872959 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxfv9\" (UniqueName: \"kubernetes.io/projected/dbd81335-256c-4b39-bb05-096ba524b652-kube-api-access-jxfv9\") pod \"dbd81335-256c-4b39-bb05-096ba524b652\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.872980 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-ovsdbserver-sb\") pod \"dbd81335-256c-4b39-bb05-096ba524b652\" (UID: \"dbd81335-256c-4b39-bb05-096ba524b652\") " Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.881640 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd81335-256c-4b39-bb05-096ba524b652-kube-api-access-jxfv9" (OuterVolumeSpecName: "kube-api-access-jxfv9") pod "dbd81335-256c-4b39-bb05-096ba524b652" (UID: "dbd81335-256c-4b39-bb05-096ba524b652"). InnerVolumeSpecName "kube-api-access-jxfv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.914809 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dbd81335-256c-4b39-bb05-096ba524b652" (UID: "dbd81335-256c-4b39-bb05-096ba524b652"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.916207 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dbd81335-256c-4b39-bb05-096ba524b652" (UID: "dbd81335-256c-4b39-bb05-096ba524b652"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.919748 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dbd81335-256c-4b39-bb05-096ba524b652" (UID: "dbd81335-256c-4b39-bb05-096ba524b652"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.924214 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-config" (OuterVolumeSpecName: "config") pod "dbd81335-256c-4b39-bb05-096ba524b652" (UID: "dbd81335-256c-4b39-bb05-096ba524b652"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.975242 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.975286 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxfv9\" (UniqueName: \"kubernetes.io/projected/dbd81335-256c-4b39-bb05-096ba524b652-kube-api-access-jxfv9\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.975296 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.975306 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:27 crc kubenswrapper[4835]: I1003 18:31:27.975314 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dbd81335-256c-4b39-bb05-096ba524b652-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:28 crc kubenswrapper[4835]: I1003 18:31:28.681830 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" event={"ID":"dbd81335-256c-4b39-bb05-096ba524b652","Type":"ContainerDied","Data":"03196989d955ed020a12a52b2170aa2ad5a65737f0fcfd6ba059006f5bda2e31"} Oct 03 18:31:28 crc kubenswrapper[4835]: I1003 18:31:28.682244 4835 scope.go:117] "RemoveContainer" containerID="c03fbcfbbc7d49b7a970051bce876846f100e18c6ef3d83a8388563ff2e963af" Oct 03 18:31:28 crc kubenswrapper[4835]: I1003 18:31:28.681877 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" Oct 03 18:31:28 crc kubenswrapper[4835]: I1003 18:31:28.683885 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c","Type":"ContainerStarted","Data":"ebc74297fb9b68d64cde4db4e2b3db9543551fe1ebda3c51b5f172bd8d17caa8"} Oct 03 18:31:28 crc kubenswrapper[4835]: I1003 18:31:28.702995 4835 scope.go:117] "RemoveContainer" containerID="02bdb55047f68f3bbab1cd69d2927a82fccb8f5e5d281a634332057c9fcdcd00" Oct 03 18:31:28 crc kubenswrapper[4835]: I1003 18:31:28.723866 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.723847415 podStartE2EDuration="15.723847415s" podCreationTimestamp="2025-10-03 18:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:31:28.719401606 +0000 UTC m=+1030.435342488" watchObservedRunningTime="2025-10-03 18:31:28.723847415 +0000 UTC m=+1030.439788287" Oct 03 18:31:28 crc kubenswrapper[4835]: I1003 18:31:28.738445 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4d484c5-nhr9s"] Oct 03 18:31:28 crc kubenswrapper[4835]: I1003 18:31:28.759779 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4d484c5-nhr9s"] Oct 03 18:31:28 crc kubenswrapper[4835]: I1003 18:31:28.813403 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:28 crc kubenswrapper[4835]: I1003 18:31:28.813454 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:28 crc kubenswrapper[4835]: I1003 18:31:28.822827 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:28 crc kubenswrapper[4835]: I1003 18:31:28.905765 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd81335-256c-4b39-bb05-096ba524b652" path="/var/lib/kubelet/pods/dbd81335-256c-4b39-bb05-096ba524b652/volumes" Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.045590 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.094680 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-run-ovn\") pod \"b70ca8c6-6107-4063-b114-5d2f2b249c23\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.094794 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b70ca8c6-6107-4063-b114-5d2f2b249c23-scripts\") pod \"b70ca8c6-6107-4063-b114-5d2f2b249c23\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.094793 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b70ca8c6-6107-4063-b114-5d2f2b249c23" (UID: "b70ca8c6-6107-4063-b114-5d2f2b249c23"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.094854 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-log-ovn\") pod \"b70ca8c6-6107-4063-b114-5d2f2b249c23\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.094891 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j85qp\" (UniqueName: \"kubernetes.io/projected/b70ca8c6-6107-4063-b114-5d2f2b249c23-kube-api-access-j85qp\") pod \"b70ca8c6-6107-4063-b114-5d2f2b249c23\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.094980 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b70ca8c6-6107-4063-b114-5d2f2b249c23-additional-scripts\") pod \"b70ca8c6-6107-4063-b114-5d2f2b249c23\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.094955 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b70ca8c6-6107-4063-b114-5d2f2b249c23" (UID: "b70ca8c6-6107-4063-b114-5d2f2b249c23"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.095043 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-run\") pod \"b70ca8c6-6107-4063-b114-5d2f2b249c23\" (UID: \"b70ca8c6-6107-4063-b114-5d2f2b249c23\") " Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.095475 4835 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.095500 4835 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.095535 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-run" (OuterVolumeSpecName: "var-run") pod "b70ca8c6-6107-4063-b114-5d2f2b249c23" (UID: "b70ca8c6-6107-4063-b114-5d2f2b249c23"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.096011 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b70ca8c6-6107-4063-b114-5d2f2b249c23-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b70ca8c6-6107-4063-b114-5d2f2b249c23" (UID: "b70ca8c6-6107-4063-b114-5d2f2b249c23"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.096127 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b70ca8c6-6107-4063-b114-5d2f2b249c23-scripts" (OuterVolumeSpecName: "scripts") pod "b70ca8c6-6107-4063-b114-5d2f2b249c23" (UID: "b70ca8c6-6107-4063-b114-5d2f2b249c23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.101948 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b70ca8c6-6107-4063-b114-5d2f2b249c23-kube-api-access-j85qp" (OuterVolumeSpecName: "kube-api-access-j85qp") pod "b70ca8c6-6107-4063-b114-5d2f2b249c23" (UID: "b70ca8c6-6107-4063-b114-5d2f2b249c23"). InnerVolumeSpecName "kube-api-access-j85qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.196773 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b70ca8c6-6107-4063-b114-5d2f2b249c23-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.196803 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j85qp\" (UniqueName: \"kubernetes.io/projected/b70ca8c6-6107-4063-b114-5d2f2b249c23-kube-api-access-j85qp\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.196812 4835 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b70ca8c6-6107-4063-b114-5d2f2b249c23-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.196822 4835 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b70ca8c6-6107-4063-b114-5d2f2b249c23-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.694760 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-plxn4-config-pgfpp" event={"ID":"b70ca8c6-6107-4063-b114-5d2f2b249c23","Type":"ContainerDied","Data":"81384af941e313ca2769a42987079d9ad5e1182364c36a00d844d38c3632b02f"} Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.694783 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-plxn4-config-pgfpp" Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.694803 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81384af941e313ca2769a42987079d9ad5e1182364c36a00d844d38c3632b02f" Oct 03 18:31:29 crc kubenswrapper[4835]: I1003 18:31:29.702812 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 03 18:31:30 crc kubenswrapper[4835]: I1003 18:31:30.126579 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-plxn4-config-pgfpp"] Oct 03 18:31:30 crc kubenswrapper[4835]: I1003 18:31:30.133815 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-plxn4-config-pgfpp"] Oct 03 18:31:30 crc kubenswrapper[4835]: I1003 18:31:30.887989 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b70ca8c6-6107-4063-b114-5d2f2b249c23" path="/var/lib/kubelet/pods/b70ca8c6-6107-4063-b114-5d2f2b249c23/volumes" Oct 03 18:31:32 crc kubenswrapper[4835]: I1003 18:31:32.761142 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6bb4d484c5-nhr9s" podUID="dbd81335-256c-4b39-bb05-096ba524b652" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: i/o timeout" Oct 03 18:31:36 crc kubenswrapper[4835]: I1003 18:31:36.518412 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6fd26bdb-868b-49db-9698-e7c79eea5cef" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Oct 03 18:31:36 crc kubenswrapper[4835]: I1003 18:31:36.805283 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="2f5f99aa-dba6-465b-866a-1e293ba51685" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Oct 03 18:31:37 crc kubenswrapper[4835]: I1003 18:31:37.083373 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="b17ce629-9abd-42ba-8004-cc4b85cee405" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Oct 03 18:31:42 crc kubenswrapper[4835]: E1003 18:31:42.095462 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-glance-api:watcher_latest" Oct 03 18:31:42 crc kubenswrapper[4835]: E1003 18:31:42.096187 4835 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-glance-api:watcher_latest" Oct 03 18:31:42 crc kubenswrapper[4835]: E1003 18:31:42.096424 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:38.102.83.82:5001/podified-master-centos10/openstack-glance-api:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mtrrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-9jg6f_openstack(94622de9-5048-41be-875b-dc37acc7eba4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 18:31:42 crc kubenswrapper[4835]: E1003 18:31:42.098763 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-9jg6f" podUID="94622de9-5048-41be-875b-dc37acc7eba4" Oct 03 18:31:42 crc kubenswrapper[4835]: E1003 18:31:42.793921 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.82:5001/podified-master-centos10/openstack-glance-api:watcher_latest\\\"\"" pod="openstack/glance-db-sync-9jg6f" podUID="94622de9-5048-41be-875b-dc37acc7eba4" Oct 03 18:31:46 crc kubenswrapper[4835]: I1003 18:31:46.518608 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 03 18:31:46 crc kubenswrapper[4835]: I1003 18:31:46.804237 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:31:46 crc kubenswrapper[4835]: I1003 18:31:46.977836 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-dk4ps"] Oct 03 18:31:46 crc kubenswrapper[4835]: E1003 18:31:46.978170 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd81335-256c-4b39-bb05-096ba524b652" containerName="dnsmasq-dns" Oct 03 18:31:46 crc kubenswrapper[4835]: I1003 
18:31:46.978182 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd81335-256c-4b39-bb05-096ba524b652" containerName="dnsmasq-dns" Oct 03 18:31:46 crc kubenswrapper[4835]: E1003 18:31:46.978201 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd81335-256c-4b39-bb05-096ba524b652" containerName="init" Oct 03 18:31:46 crc kubenswrapper[4835]: I1003 18:31:46.978207 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd81335-256c-4b39-bb05-096ba524b652" containerName="init" Oct 03 18:31:46 crc kubenswrapper[4835]: E1003 18:31:46.978232 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70ca8c6-6107-4063-b114-5d2f2b249c23" containerName="ovn-config" Oct 03 18:31:46 crc kubenswrapper[4835]: I1003 18:31:46.978239 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70ca8c6-6107-4063-b114-5d2f2b249c23" containerName="ovn-config" Oct 03 18:31:46 crc kubenswrapper[4835]: I1003 18:31:46.978402 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70ca8c6-6107-4063-b114-5d2f2b249c23" containerName="ovn-config" Oct 03 18:31:46 crc kubenswrapper[4835]: I1003 18:31:46.978422 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd81335-256c-4b39-bb05-096ba524b652" containerName="dnsmasq-dns" Oct 03 18:31:46 crc kubenswrapper[4835]: I1003 18:31:46.978933 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dk4ps" Oct 03 18:31:46 crc kubenswrapper[4835]: I1003 18:31:46.998460 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-dk4ps"] Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.072747 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-wp92w"] Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.073806 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wp92w" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.081269 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6kr6\" (UniqueName: \"kubernetes.io/projected/bc7fc2c2-1936-44ed-ba40-1baa10908df0-kube-api-access-v6kr6\") pod \"cinder-db-create-dk4ps\" (UID: \"bc7fc2c2-1936-44ed-ba40-1baa10908df0\") " pod="openstack/cinder-db-create-dk4ps" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.083280 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.089362 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wp92w"] Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.185132 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhl6g\" (UniqueName: \"kubernetes.io/projected/a9162685-16b5-45ab-824b-6d5cbd3e3d98-kube-api-access-nhl6g\") pod \"barbican-db-create-wp92w\" (UID: \"a9162685-16b5-45ab-824b-6d5cbd3e3d98\") " pod="openstack/barbican-db-create-wp92w" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.185181 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6kr6\" (UniqueName: \"kubernetes.io/projected/bc7fc2c2-1936-44ed-ba40-1baa10908df0-kube-api-access-v6kr6\") pod \"cinder-db-create-dk4ps\" (UID: \"bc7fc2c2-1936-44ed-ba40-1baa10908df0\") " pod="openstack/cinder-db-create-dk4ps" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.215273 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6kr6\" (UniqueName: \"kubernetes.io/projected/bc7fc2c2-1936-44ed-ba40-1baa10908df0-kube-api-access-v6kr6\") pod \"cinder-db-create-dk4ps\" (UID: \"bc7fc2c2-1936-44ed-ba40-1baa10908df0\") " pod="openstack/cinder-db-create-dk4ps" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.235699 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-g9qx7"] Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.236827 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-g9qx7" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.240155 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vd8gf" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.240353 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.240456 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.240549 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.251712 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-g9qx7"] Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.286373 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhl6g\" (UniqueName: \"kubernetes.io/projected/a9162685-16b5-45ab-824b-6d5cbd3e3d98-kube-api-access-nhl6g\") pod \"barbican-db-create-wp92w\" (UID: \"a9162685-16b5-45ab-824b-6d5cbd3e3d98\") " pod="openstack/barbican-db-create-wp92w" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.297848 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dk4ps" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.317987 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhl6g\" (UniqueName: \"kubernetes.io/projected/a9162685-16b5-45ab-824b-6d5cbd3e3d98-kube-api-access-nhl6g\") pod \"barbican-db-create-wp92w\" (UID: \"a9162685-16b5-45ab-824b-6d5cbd3e3d98\") " pod="openstack/barbican-db-create-wp92w" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.373294 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-t6nss"] Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.374549 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-t6nss" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.385621 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-t6nss"] Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.387509 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07b0cbfd-5980-4408-927d-6e8b474a09a7-config-data\") pod \"keystone-db-sync-g9qx7\" (UID: \"07b0cbfd-5980-4408-927d-6e8b474a09a7\") " pod="openstack/keystone-db-sync-g9qx7" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.387658 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh5vq\" (UniqueName: \"kubernetes.io/projected/07b0cbfd-5980-4408-927d-6e8b474a09a7-kube-api-access-kh5vq\") pod \"keystone-db-sync-g9qx7\" (UID: \"07b0cbfd-5980-4408-927d-6e8b474a09a7\") " pod="openstack/keystone-db-sync-g9qx7" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.387695 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07b0cbfd-5980-4408-927d-6e8b474a09a7-combined-ca-bundle\") pod \"keystone-db-sync-g9qx7\" (UID: \"07b0cbfd-5980-4408-927d-6e8b474a09a7\") " pod="openstack/keystone-db-sync-g9qx7" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.396405 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wp92w" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.488805 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh5vq\" (UniqueName: \"kubernetes.io/projected/07b0cbfd-5980-4408-927d-6e8b474a09a7-kube-api-access-kh5vq\") pod \"keystone-db-sync-g9qx7\" (UID: \"07b0cbfd-5980-4408-927d-6e8b474a09a7\") " pod="openstack/keystone-db-sync-g9qx7" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.489143 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07b0cbfd-5980-4408-927d-6e8b474a09a7-combined-ca-bundle\") pod \"keystone-db-sync-g9qx7\" (UID: \"07b0cbfd-5980-4408-927d-6e8b474a09a7\") " pod="openstack/keystone-db-sync-g9qx7" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.489205 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4vj4\" (UniqueName: \"kubernetes.io/projected/1223b6d4-6430-46e3-8df7-de4952fe563e-kube-api-access-b4vj4\") pod \"neutron-db-create-t6nss\" (UID: \"1223b6d4-6430-46e3-8df7-de4952fe563e\") " pod="openstack/neutron-db-create-t6nss" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.489227 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07b0cbfd-5980-4408-927d-6e8b474a09a7-config-data\") pod \"keystone-db-sync-g9qx7\" (UID: \"07b0cbfd-5980-4408-927d-6e8b474a09a7\") " pod="openstack/keystone-db-sync-g9qx7" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.497673 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07b0cbfd-5980-4408-927d-6e8b474a09a7-combined-ca-bundle\") pod \"keystone-db-sync-g9qx7\" (UID: \"07b0cbfd-5980-4408-927d-6e8b474a09a7\") " pod="openstack/keystone-db-sync-g9qx7" Oct 03 18:31:47 crc 
kubenswrapper[4835]: I1003 18:31:47.504847 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07b0cbfd-5980-4408-927d-6e8b474a09a7-config-data\") pod \"keystone-db-sync-g9qx7\" (UID: \"07b0cbfd-5980-4408-927d-6e8b474a09a7\") " pod="openstack/keystone-db-sync-g9qx7" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.510229 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh5vq\" (UniqueName: \"kubernetes.io/projected/07b0cbfd-5980-4408-927d-6e8b474a09a7-kube-api-access-kh5vq\") pod \"keystone-db-sync-g9qx7\" (UID: \"07b0cbfd-5980-4408-927d-6e8b474a09a7\") " pod="openstack/keystone-db-sync-g9qx7" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.585903 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-g9qx7" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.591909 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4vj4\" (UniqueName: \"kubernetes.io/projected/1223b6d4-6430-46e3-8df7-de4952fe563e-kube-api-access-b4vj4\") pod \"neutron-db-create-t6nss\" (UID: \"1223b6d4-6430-46e3-8df7-de4952fe563e\") " pod="openstack/neutron-db-create-t6nss" Oct 03 18:31:47 crc kubenswrapper[4835]: I1003 18:31:47.615121 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4vj4\" (UniqueName: \"kubernetes.io/projected/1223b6d4-6430-46e3-8df7-de4952fe563e-kube-api-access-b4vj4\") pod \"neutron-db-create-t6nss\" (UID: \"1223b6d4-6430-46e3-8df7-de4952fe563e\") " pod="openstack/neutron-db-create-t6nss" Oct 03 18:31:48 crc kubenswrapper[4835]: I1003 18:31:47.692697 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-t6nss" Oct 03 18:31:48 crc kubenswrapper[4835]: I1003 18:31:47.802005 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-dk4ps"] Oct 03 18:31:48 crc kubenswrapper[4835]: W1003 18:31:47.846781 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc7fc2c2_1936_44ed_ba40_1baa10908df0.slice/crio-6a58590222a281bd8d5db2623cc3ba762ee31d01f3cf90a55fda311e802f89f5 WatchSource:0}: Error finding container 6a58590222a281bd8d5db2623cc3ba762ee31d01f3cf90a55fda311e802f89f5: Status 404 returned error can't find the container with id 6a58590222a281bd8d5db2623cc3ba762ee31d01f3cf90a55fda311e802f89f5 Oct 03 18:31:48 crc kubenswrapper[4835]: I1003 18:31:48.020605 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wp92w"] Oct 03 18:31:48 crc kubenswrapper[4835]: W1003 18:31:48.023046 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9162685_16b5_45ab_824b_6d5cbd3e3d98.slice/crio-56e2645d9cef997ded65906bb7f05f60e48f180dd407329cc447bcd1ceb66607 WatchSource:0}: Error finding container 56e2645d9cef997ded65906bb7f05f60e48f180dd407329cc447bcd1ceb66607: Status 404 returned error can't find the container with id 56e2645d9cef997ded65906bb7f05f60e48f180dd407329cc447bcd1ceb66607 Oct 03 18:31:48 crc kubenswrapper[4835]: I1003 18:31:48.735302 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-g9qx7"] Oct 03 18:31:48 crc kubenswrapper[4835]: W1003 18:31:48.750240 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07b0cbfd_5980_4408_927d_6e8b474a09a7.slice/crio-fcc58601b86afad184b5a47faea83f2cc458b732e913e2ef38ee284023736cf4 WatchSource:0}: Error finding container fcc58601b86afad184b5a47faea83f2cc458b732e913e2ef38ee284023736cf4: Status 404 returned error can't find the container with id fcc58601b86afad184b5a47faea83f2cc458b732e913e2ef38ee284023736cf4 Oct 03 18:31:48 crc kubenswrapper[4835]: I1003 18:31:48.830559 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-t6nss"] Oct 03 18:31:48 crc kubenswrapper[4835]: W1003 18:31:48.856640 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1223b6d4_6430_46e3_8df7_de4952fe563e.slice/crio-1c5730ef659ae67b5523afe7f2472b4be74eedda6cfea2c662bcf1942160c157 WatchSource:0}: Error finding container 1c5730ef659ae67b5523afe7f2472b4be74eedda6cfea2c662bcf1942160c157: Status 404 returned error can't find the container with id 1c5730ef659ae67b5523afe7f2472b4be74eedda6cfea2c662bcf1942160c157 Oct 03 18:31:48 crc kubenswrapper[4835]: I1003 18:31:48.909255 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-t6nss" event={"ID":"1223b6d4-6430-46e3-8df7-de4952fe563e","Type":"ContainerStarted","Data":"1c5730ef659ae67b5523afe7f2472b4be74eedda6cfea2c662bcf1942160c157"} Oct 03 18:31:48 crc kubenswrapper[4835]: I1003 18:31:48.917163 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-g9qx7" event={"ID":"07b0cbfd-5980-4408-927d-6e8b474a09a7","Type":"ContainerStarted","Data":"fcc58601b86afad184b5a47faea83f2cc458b732e913e2ef38ee284023736cf4"} Oct 03 18:31:48 crc kubenswrapper[4835]: I1003 18:31:48.919111 4835 
generic.go:334] "Generic (PLEG): container finished" podID="bc7fc2c2-1936-44ed-ba40-1baa10908df0" containerID="fa936e2298c5f749805b7e599706afbd4bf89fef32ea343742e3924c933db470" exitCode=0 Oct 03 18:31:48 crc kubenswrapper[4835]: I1003 18:31:48.919262 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dk4ps" event={"ID":"bc7fc2c2-1936-44ed-ba40-1baa10908df0","Type":"ContainerDied","Data":"fa936e2298c5f749805b7e599706afbd4bf89fef32ea343742e3924c933db470"} Oct 03 18:31:48 crc kubenswrapper[4835]: I1003 18:31:48.919355 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dk4ps" event={"ID":"bc7fc2c2-1936-44ed-ba40-1baa10908df0","Type":"ContainerStarted","Data":"6a58590222a281bd8d5db2623cc3ba762ee31d01f3cf90a55fda311e802f89f5"} Oct 03 18:31:48 crc kubenswrapper[4835]: I1003 18:31:48.929432 4835 generic.go:334] "Generic (PLEG): container finished" podID="a9162685-16b5-45ab-824b-6d5cbd3e3d98" containerID="7feb2944e524dc8a34f99172d3aae778a9ac99aa266e0a128498863dac0fc279" exitCode=0 Oct 03 18:31:48 crc kubenswrapper[4835]: I1003 18:31:48.930029 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wp92w" event={"ID":"a9162685-16b5-45ab-824b-6d5cbd3e3d98","Type":"ContainerDied","Data":"7feb2944e524dc8a34f99172d3aae778a9ac99aa266e0a128498863dac0fc279"} Oct 03 18:31:48 crc kubenswrapper[4835]: I1003 18:31:48.930273 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wp92w" event={"ID":"a9162685-16b5-45ab-824b-6d5cbd3e3d98","Type":"ContainerStarted","Data":"56e2645d9cef997ded65906bb7f05f60e48f180dd407329cc447bcd1ceb66607"} Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.662645 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-9bqlb"] Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.663741 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-9bqlb" Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.666897 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.672371 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-plmhx" Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.682455 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-9bqlb"] Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.741656 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27fx8\" (UniqueName: \"kubernetes.io/projected/2f585da2-5074-43be-8260-e854b0e3b1a6-kube-api-access-27fx8\") pod \"watcher-db-sync-9bqlb\" (UID: \"2f585da2-5074-43be-8260-e854b0e3b1a6\") " pod="openstack/watcher-db-sync-9bqlb" Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.741745 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-combined-ca-bundle\") pod \"watcher-db-sync-9bqlb\" (UID: \"2f585da2-5074-43be-8260-e854b0e3b1a6\") " pod="openstack/watcher-db-sync-9bqlb" Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.741779 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-db-sync-config-data\") pod \"watcher-db-sync-9bqlb\" (UID: \"2f585da2-5074-43be-8260-e854b0e3b1a6\") " pod="openstack/watcher-db-sync-9bqlb" Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.741892 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-config-data\") pod \"watcher-db-sync-9bqlb\" (UID: \"2f585da2-5074-43be-8260-e854b0e3b1a6\") " pod="openstack/watcher-db-sync-9bqlb" Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.843277 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-combined-ca-bundle\") pod \"watcher-db-sync-9bqlb\" (UID: \"2f585da2-5074-43be-8260-e854b0e3b1a6\") " pod="openstack/watcher-db-sync-9bqlb" Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.843332 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-db-sync-config-data\") pod \"watcher-db-sync-9bqlb\" (UID: \"2f585da2-5074-43be-8260-e854b0e3b1a6\") " pod="openstack/watcher-db-sync-9bqlb" Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.843405 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-config-data\") pod \"watcher-db-sync-9bqlb\" (UID: \"2f585da2-5074-43be-8260-e854b0e3b1a6\") " pod="openstack/watcher-db-sync-9bqlb" Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.843437 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27fx8\" (UniqueName: \"kubernetes.io/projected/2f585da2-5074-43be-8260-e854b0e3b1a6-kube-api-access-27fx8\") 
pod \"watcher-db-sync-9bqlb\" (UID: \"2f585da2-5074-43be-8260-e854b0e3b1a6\") " pod="openstack/watcher-db-sync-9bqlb" Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.849278 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-combined-ca-bundle\") pod \"watcher-db-sync-9bqlb\" (UID: \"2f585da2-5074-43be-8260-e854b0e3b1a6\") " pod="openstack/watcher-db-sync-9bqlb" Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.849825 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-config-data\") pod \"watcher-db-sync-9bqlb\" (UID: \"2f585da2-5074-43be-8260-e854b0e3b1a6\") " pod="openstack/watcher-db-sync-9bqlb" Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.856597 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-db-sync-config-data\") pod \"watcher-db-sync-9bqlb\" (UID: \"2f585da2-5074-43be-8260-e854b0e3b1a6\") " pod="openstack/watcher-db-sync-9bqlb" Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.859994 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27fx8\" (UniqueName: \"kubernetes.io/projected/2f585da2-5074-43be-8260-e854b0e3b1a6-kube-api-access-27fx8\") pod \"watcher-db-sync-9bqlb\" (UID: \"2f585da2-5074-43be-8260-e854b0e3b1a6\") " pod="openstack/watcher-db-sync-9bqlb" Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.938915 4835 generic.go:334] "Generic (PLEG): container finished" podID="1223b6d4-6430-46e3-8df7-de4952fe563e" containerID="097f5bff0c61523a37b6b642ddae3aa39eb0c0ed802eb3f58b26bd3f4cb64fd2" exitCode=0 Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.939098 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-t6nss" event={"ID":"1223b6d4-6430-46e3-8df7-de4952fe563e","Type":"ContainerDied","Data":"097f5bff0c61523a37b6b642ddae3aa39eb0c0ed802eb3f58b26bd3f4cb64fd2"} Oct 03 18:31:49 crc kubenswrapper[4835]: I1003 18:31:49.986168 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-9bqlb" Oct 03 18:31:50 crc kubenswrapper[4835]: I1003 18:31:50.410231 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wp92w" Oct 03 18:31:50 crc kubenswrapper[4835]: I1003 18:31:50.429958 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-dk4ps" Oct 03 18:31:50 crc kubenswrapper[4835]: I1003 18:31:50.553423 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6kr6\" (UniqueName: \"kubernetes.io/projected/bc7fc2c2-1936-44ed-ba40-1baa10908df0-kube-api-access-v6kr6\") pod \"bc7fc2c2-1936-44ed-ba40-1baa10908df0\" (UID: \"bc7fc2c2-1936-44ed-ba40-1baa10908df0\") " Oct 03 18:31:50 crc kubenswrapper[4835]: I1003 18:31:50.553532 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhl6g\" (UniqueName: \"kubernetes.io/projected/a9162685-16b5-45ab-824b-6d5cbd3e3d98-kube-api-access-nhl6g\") pod \"a9162685-16b5-45ab-824b-6d5cbd3e3d98\" (UID: \"a9162685-16b5-45ab-824b-6d5cbd3e3d98\") " Oct 03 18:31:50 crc kubenswrapper[4835]: I1003 18:31:50.560036 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9162685-16b5-45ab-824b-6d5cbd3e3d98-kube-api-access-nhl6g" (OuterVolumeSpecName: "kube-api-access-nhl6g") pod "a9162685-16b5-45ab-824b-6d5cbd3e3d98" (UID: "a9162685-16b5-45ab-824b-6d5cbd3e3d98"). InnerVolumeSpecName "kube-api-access-nhl6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:50 crc kubenswrapper[4835]: I1003 18:31:50.560192 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc7fc2c2-1936-44ed-ba40-1baa10908df0-kube-api-access-v6kr6" (OuterVolumeSpecName: "kube-api-access-v6kr6") pod "bc7fc2c2-1936-44ed-ba40-1baa10908df0" (UID: "bc7fc2c2-1936-44ed-ba40-1baa10908df0"). InnerVolumeSpecName "kube-api-access-v6kr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:50 crc kubenswrapper[4835]: I1003 18:31:50.562315 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-9bqlb"] Oct 03 18:31:50 crc kubenswrapper[4835]: W1003 18:31:50.567632 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f585da2_5074_43be_8260_e854b0e3b1a6.slice/crio-15541d07176e536500a59b73c015f0fe73a9bae24ec78773251fd0b7aff13959 WatchSource:0}: Error finding container 15541d07176e536500a59b73c015f0fe73a9bae24ec78773251fd0b7aff13959: Status 404 returned error can't find the container with id 15541d07176e536500a59b73c015f0fe73a9bae24ec78773251fd0b7aff13959 Oct 03 18:31:50 crc kubenswrapper[4835]: I1003 18:31:50.655962 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6kr6\" (UniqueName: \"kubernetes.io/projected/bc7fc2c2-1936-44ed-ba40-1baa10908df0-kube-api-access-v6kr6\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:50 crc kubenswrapper[4835]: I1003 18:31:50.656047 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhl6g\" (UniqueName: \"kubernetes.io/projected/a9162685-16b5-45ab-824b-6d5cbd3e3d98-kube-api-access-nhl6g\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:50 crc kubenswrapper[4835]: I1003 18:31:50.954433 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-9bqlb" event={"ID":"2f585da2-5074-43be-8260-e854b0e3b1a6","Type":"ContainerStarted","Data":"15541d07176e536500a59b73c015f0fe73a9bae24ec78773251fd0b7aff13959"} Oct 03 18:31:50 crc kubenswrapper[4835]: I1003 18:31:50.957427 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dk4ps" 
event={"ID":"bc7fc2c2-1936-44ed-ba40-1baa10908df0","Type":"ContainerDied","Data":"6a58590222a281bd8d5db2623cc3ba762ee31d01f3cf90a55fda311e802f89f5"} Oct 03 18:31:50 crc kubenswrapper[4835]: I1003 18:31:50.957458 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dk4ps" Oct 03 18:31:50 crc kubenswrapper[4835]: I1003 18:31:50.957469 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a58590222a281bd8d5db2623cc3ba762ee31d01f3cf90a55fda311e802f89f5" Oct 03 18:31:50 crc kubenswrapper[4835]: I1003 18:31:50.962934 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wp92w" event={"ID":"a9162685-16b5-45ab-824b-6d5cbd3e3d98","Type":"ContainerDied","Data":"56e2645d9cef997ded65906bb7f05f60e48f180dd407329cc447bcd1ceb66607"} Oct 03 18:31:50 crc kubenswrapper[4835]: I1003 18:31:50.962965 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56e2645d9cef997ded65906bb7f05f60e48f180dd407329cc447bcd1ceb66607" Oct 03 18:31:50 crc kubenswrapper[4835]: I1003 18:31:50.963039 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wp92w" Oct 03 18:31:53 crc kubenswrapper[4835]: I1003 18:31:53.358197 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-t6nss" Oct 03 18:31:53 crc kubenswrapper[4835]: I1003 18:31:53.505656 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4vj4\" (UniqueName: \"kubernetes.io/projected/1223b6d4-6430-46e3-8df7-de4952fe563e-kube-api-access-b4vj4\") pod \"1223b6d4-6430-46e3-8df7-de4952fe563e\" (UID: \"1223b6d4-6430-46e3-8df7-de4952fe563e\") " Oct 03 18:31:53 crc kubenswrapper[4835]: I1003 18:31:53.512154 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1223b6d4-6430-46e3-8df7-de4952fe563e-kube-api-access-b4vj4" (OuterVolumeSpecName: "kube-api-access-b4vj4") pod "1223b6d4-6430-46e3-8df7-de4952fe563e" (UID: "1223b6d4-6430-46e3-8df7-de4952fe563e"). InnerVolumeSpecName "kube-api-access-b4vj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:31:53 crc kubenswrapper[4835]: I1003 18:31:53.607398 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4vj4\" (UniqueName: \"kubernetes.io/projected/1223b6d4-6430-46e3-8df7-de4952fe563e-kube-api-access-b4vj4\") on node \"crc\" DevicePath \"\"" Oct 03 18:31:53 crc kubenswrapper[4835]: I1003 18:31:53.992961 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-t6nss" event={"ID":"1223b6d4-6430-46e3-8df7-de4952fe563e","Type":"ContainerDied","Data":"1c5730ef659ae67b5523afe7f2472b4be74eedda6cfea2c662bcf1942160c157"} Oct 03 18:31:53 crc kubenswrapper[4835]: I1003 18:31:53.993004 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c5730ef659ae67b5523afe7f2472b4be74eedda6cfea2c662bcf1942160c157" Oct 03 18:31:53 crc kubenswrapper[4835]: I1003 18:31:53.993048 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-t6nss" Oct 03 18:31:56 crc kubenswrapper[4835]: I1003 18:31:56.961660 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c692-account-create-wrwnk"] Oct 03 18:31:56 crc kubenswrapper[4835]: E1003 18:31:56.962681 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9162685-16b5-45ab-824b-6d5cbd3e3d98" containerName="mariadb-database-create" Oct 03 18:31:56 crc kubenswrapper[4835]: I1003 18:31:56.962696 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9162685-16b5-45ab-824b-6d5cbd3e3d98" containerName="mariadb-database-create" Oct 03 18:31:56 crc kubenswrapper[4835]: E1003 18:31:56.962763 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1223b6d4-6430-46e3-8df7-de4952fe563e" containerName="mariadb-database-create" Oct 03 18:31:56 crc kubenswrapper[4835]: I1003 18:31:56.962771 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1223b6d4-6430-46e3-8df7-de4952fe563e" containerName="mariadb-database-create" Oct 03 18:31:56 crc kubenswrapper[4835]: E1003 18:31:56.962788 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc7fc2c2-1936-44ed-ba40-1baa10908df0" containerName="mariadb-database-create" Oct 03 18:31:56 crc kubenswrapper[4835]: I1003 18:31:56.962794 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc7fc2c2-1936-44ed-ba40-1baa10908df0" containerName="mariadb-database-create" Oct 03 18:31:56 crc kubenswrapper[4835]: I1003 18:31:56.962981 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1223b6d4-6430-46e3-8df7-de4952fe563e" containerName="mariadb-database-create" Oct 03 18:31:56 crc kubenswrapper[4835]: I1003 18:31:56.963006 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc7fc2c2-1936-44ed-ba40-1baa10908df0" containerName="mariadb-database-create" Oct 03 18:31:56 crc kubenswrapper[4835]: I1003 18:31:56.963016 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9162685-16b5-45ab-824b-6d5cbd3e3d98" containerName="mariadb-database-create" Oct 03 18:31:56 crc kubenswrapper[4835]: I1003 18:31:56.963725 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c692-account-create-wrwnk" Oct 03 18:31:56 crc kubenswrapper[4835]: I1003 18:31:56.966191 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 03 18:31:56 crc kubenswrapper[4835]: I1003 18:31:56.969783 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c692-account-create-wrwnk"] Oct 03 18:31:57 crc kubenswrapper[4835]: I1003 18:31:57.052111 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d472-account-create-4n9kl"] Oct 03 18:31:57 crc kubenswrapper[4835]: I1003 18:31:57.053396 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d472-account-create-4n9kl" Oct 03 18:31:57 crc kubenswrapper[4835]: I1003 18:31:57.055333 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 03 18:31:57 crc kubenswrapper[4835]: I1003 18:31:57.062825 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmcg2\" (UniqueName: \"kubernetes.io/projected/45a00bc2-b857-4530-8823-66a7204feade-kube-api-access-lmcg2\") pod \"cinder-c692-account-create-wrwnk\" (UID: \"45a00bc2-b857-4530-8823-66a7204feade\") " pod="openstack/cinder-c692-account-create-wrwnk" Oct 03 18:31:57 crc kubenswrapper[4835]: I1003 18:31:57.065248 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d472-account-create-4n9kl"] Oct 03 18:31:57 crc kubenswrapper[4835]: I1003 18:31:57.164783 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9mns\" (UniqueName: \"kubernetes.io/projected/e42d7b31-acc0-4ed1-87f5-3a2fad499556-kube-api-access-l9mns\") pod \"barbican-d472-account-create-4n9kl\" (UID: \"e42d7b31-acc0-4ed1-87f5-3a2fad499556\") " pod="openstack/barbican-d472-account-create-4n9kl" Oct 03 18:31:57 crc kubenswrapper[4835]: I1003 18:31:57.164880 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmcg2\" (UniqueName: \"kubernetes.io/projected/45a00bc2-b857-4530-8823-66a7204feade-kube-api-access-lmcg2\") pod \"cinder-c692-account-create-wrwnk\" (UID: \"45a00bc2-b857-4530-8823-66a7204feade\") " pod="openstack/cinder-c692-account-create-wrwnk" Oct 03 18:31:57 crc kubenswrapper[4835]: I1003 18:31:57.183541 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmcg2\" (UniqueName: \"kubernetes.io/projected/45a00bc2-b857-4530-8823-66a7204feade-kube-api-access-lmcg2\") pod \"cinder-c692-account-create-wrwnk\" (UID: \"45a00bc2-b857-4530-8823-66a7204feade\") " pod="openstack/cinder-c692-account-create-wrwnk" Oct 03 18:31:57 crc kubenswrapper[4835]: I1003 18:31:57.266000 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9mns\" (UniqueName: \"kubernetes.io/projected/e42d7b31-acc0-4ed1-87f5-3a2fad499556-kube-api-access-l9mns\") pod \"barbican-d472-account-create-4n9kl\" (UID: \"e42d7b31-acc0-4ed1-87f5-3a2fad499556\") " pod="openstack/barbican-d472-account-create-4n9kl" Oct 03 18:31:57 crc kubenswrapper[4835]: I1003 18:31:57.280558 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c692-account-create-wrwnk" Oct 03 18:31:57 crc kubenswrapper[4835]: I1003 18:31:57.283412 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9mns\" (UniqueName: \"kubernetes.io/projected/e42d7b31-acc0-4ed1-87f5-3a2fad499556-kube-api-access-l9mns\") pod \"barbican-d472-account-create-4n9kl\" (UID: \"e42d7b31-acc0-4ed1-87f5-3a2fad499556\") " pod="openstack/barbican-d472-account-create-4n9kl" Oct 03 18:31:57 crc kubenswrapper[4835]: I1003 18:31:57.369117 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d472-account-create-4n9kl" Oct 03 18:31:58 crc kubenswrapper[4835]: I1003 18:31:58.776968 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d472-account-create-4n9kl"] Oct 03 18:31:58 crc kubenswrapper[4835]: I1003 18:31:58.786419 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c692-account-create-wrwnk"] Oct 03 18:31:59 crc kubenswrapper[4835]: I1003 18:31:59.032749 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c692-account-create-wrwnk" event={"ID":"45a00bc2-b857-4530-8823-66a7204feade","Type":"ContainerStarted","Data":"078e39ddc273d8c4eb796cdaccebabc4766b1579c5e63c58c7ffbeeb3bad5f11"} Oct 03 18:31:59 crc kubenswrapper[4835]: I1003 18:31:59.032791 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c692-account-create-wrwnk" event={"ID":"45a00bc2-b857-4530-8823-66a7204feade","Type":"ContainerStarted","Data":"93aaae8ce1d715130a546a2e46b30c796566e7bc5ea0090f8ef92987cb38b108"} Oct 03 18:31:59 crc kubenswrapper[4835]: I1003 18:31:59.037556 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d472-account-create-4n9kl" event={"ID":"e42d7b31-acc0-4ed1-87f5-3a2fad499556","Type":"ContainerStarted","Data":"f3a8064d4600d03b601b52bd3666668b95f637711d2745fb936ce6168a9a53fd"} Oct 03 18:31:59 crc kubenswrapper[4835]: I1003 18:31:59.037605 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d472-account-create-4n9kl" event={"ID":"e42d7b31-acc0-4ed1-87f5-3a2fad499556","Type":"ContainerStarted","Data":"b9354bee0abbf8059dc3619148fb7ef29e53bfc6791f452d18d2748f3bee5b28"} Oct 03 18:31:59 crc kubenswrapper[4835]: I1003 18:31:59.044181 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9jg6f" event={"ID":"94622de9-5048-41be-875b-dc37acc7eba4","Type":"ContainerStarted","Data":"faf723616274a65eabc0b792453f25742b5b77ee440a051863d30601fd1f9525"} Oct 03 18:31:59 crc kubenswrapper[4835]: I1003 18:31:59.045960 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-9bqlb" event={"ID":"2f585da2-5074-43be-8260-e854b0e3b1a6","Type":"ContainerStarted","Data":"8cebce2b888e0113dc21d943570529659909ea96b53b980f985eb170ed64f72a"} Oct 03 18:31:59 crc kubenswrapper[4835]: I1003 18:31:59.048903 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-c692-account-create-wrwnk" podStartSLOduration=3.048875074 podStartE2EDuration="3.048875074s" podCreationTimestamp="2025-10-03 18:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:31:59.048601518 +0000 UTC m=+1060.764542400" watchObservedRunningTime="2025-10-03 18:31:59.048875074 +0000 UTC m=+1060.764815946" Oct 03 18:31:59 crc kubenswrapper[4835]: I1003 18:31:59.049966 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-g9qx7" event={"ID":"07b0cbfd-5980-4408-927d-6e8b474a09a7","Type":"ContainerStarted","Data":"42490ccf585ca2dfb4dabd7c5c3d9cd9e8a3b01cface0dd3b9c29112b1c707c1"} Oct 03 18:31:59 crc kubenswrapper[4835]: I1003 18:31:59.088014 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-9jg6f" podStartSLOduration=1.96824171 podStartE2EDuration="33.087998016s" podCreationTimestamp="2025-10-03 18:31:26 +0000 UTC" firstStartedPulling="2025-10-03 18:31:27.233447707 +0000 UTC 
m=+1028.949388579" lastFinishedPulling="2025-10-03 18:31:58.353204013 +0000 UTC m=+1060.069144885" observedRunningTime="2025-10-03 18:31:59.075904278 +0000 UTC m=+1060.791845150" watchObservedRunningTime="2025-10-03 18:31:59.087998016 +0000 UTC m=+1060.803938888" Oct 03 18:31:59 crc kubenswrapper[4835]: I1003 18:31:59.103139 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-9bqlb" podStartSLOduration=2.323155637 podStartE2EDuration="10.103121668s" podCreationTimestamp="2025-10-03 18:31:49 +0000 UTC" firstStartedPulling="2025-10-03 18:31:50.570823832 +0000 UTC m=+1052.286764704" lastFinishedPulling="2025-10-03 18:31:58.350789853 +0000 UTC m=+1060.066730735" observedRunningTime="2025-10-03 18:31:59.097697515 +0000 UTC m=+1060.813638407" watchObservedRunningTime="2025-10-03 18:31:59.103121668 +0000 UTC m=+1060.819062540" Oct 03 18:31:59 crc kubenswrapper[4835]: I1003 18:31:59.115483 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-g9qx7" podStartSLOduration=2.553647097 podStartE2EDuration="12.115460861s" podCreationTimestamp="2025-10-03 18:31:47 +0000 UTC" firstStartedPulling="2025-10-03 18:31:48.752621345 +0000 UTC m=+1050.468562217" lastFinishedPulling="2025-10-03 18:31:58.314435109 +0000 UTC m=+1060.030375981" observedRunningTime="2025-10-03 18:31:59.109992117 +0000 UTC m=+1060.825932989" watchObservedRunningTime="2025-10-03 18:31:59.115460861 +0000 UTC m=+1060.831401733" Oct 03 18:32:00 crc kubenswrapper[4835]: I1003 18:32:00.062941 4835 generic.go:334] "Generic (PLEG): container finished" podID="45a00bc2-b857-4530-8823-66a7204feade" containerID="078e39ddc273d8c4eb796cdaccebabc4766b1579c5e63c58c7ffbeeb3bad5f11" exitCode=0 Oct 03 18:32:00 crc kubenswrapper[4835]: I1003 18:32:00.063114 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c692-account-create-wrwnk" event={"ID":"45a00bc2-b857-4530-8823-66a7204feade","Type":"ContainerDied","Data":"078e39ddc273d8c4eb796cdaccebabc4766b1579c5e63c58c7ffbeeb3bad5f11"} Oct 03 18:32:00 crc kubenswrapper[4835]: I1003 18:32:00.066485 4835 generic.go:334] "Generic (PLEG): container finished" podID="e42d7b31-acc0-4ed1-87f5-3a2fad499556" containerID="f3a8064d4600d03b601b52bd3666668b95f637711d2745fb936ce6168a9a53fd" exitCode=0 Oct 03 18:32:00 crc kubenswrapper[4835]: I1003 18:32:00.066530 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d472-account-create-4n9kl" event={"ID":"e42d7b31-acc0-4ed1-87f5-3a2fad499556","Type":"ContainerDied","Data":"f3a8064d4600d03b601b52bd3666668b95f637711d2745fb936ce6168a9a53fd"} Oct 03 18:32:01 crc kubenswrapper[4835]: I1003 18:32:01.350032 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d472-account-create-4n9kl" Oct 03 18:32:01 crc kubenswrapper[4835]: I1003 18:32:01.433924 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9mns\" (UniqueName: \"kubernetes.io/projected/e42d7b31-acc0-4ed1-87f5-3a2fad499556-kube-api-access-l9mns\") pod \"e42d7b31-acc0-4ed1-87f5-3a2fad499556\" (UID: \"e42d7b31-acc0-4ed1-87f5-3a2fad499556\") " Oct 03 18:32:01 crc kubenswrapper[4835]: I1003 18:32:01.439350 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e42d7b31-acc0-4ed1-87f5-3a2fad499556-kube-api-access-l9mns" (OuterVolumeSpecName: "kube-api-access-l9mns") pod "e42d7b31-acc0-4ed1-87f5-3a2fad499556" (UID: "e42d7b31-acc0-4ed1-87f5-3a2fad499556"). InnerVolumeSpecName "kube-api-access-l9mns". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:01 crc kubenswrapper[4835]: I1003 18:32:01.511401 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c692-account-create-wrwnk" Oct 03 18:32:01 crc kubenswrapper[4835]: I1003 18:32:01.535767 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9mns\" (UniqueName: \"kubernetes.io/projected/e42d7b31-acc0-4ed1-87f5-3a2fad499556-kube-api-access-l9mns\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:01 crc kubenswrapper[4835]: I1003 18:32:01.637006 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmcg2\" (UniqueName: \"kubernetes.io/projected/45a00bc2-b857-4530-8823-66a7204feade-kube-api-access-lmcg2\") pod \"45a00bc2-b857-4530-8823-66a7204feade\" (UID: \"45a00bc2-b857-4530-8823-66a7204feade\") " Oct 03 18:32:01 crc kubenswrapper[4835]: I1003 18:32:01.641341 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45a00bc2-b857-4530-8823-66a7204feade-kube-api-access-lmcg2" (OuterVolumeSpecName: "kube-api-access-lmcg2") pod "45a00bc2-b857-4530-8823-66a7204feade" (UID: "45a00bc2-b857-4530-8823-66a7204feade"). InnerVolumeSpecName "kube-api-access-lmcg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:01 crc kubenswrapper[4835]: I1003 18:32:01.738817 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmcg2\" (UniqueName: \"kubernetes.io/projected/45a00bc2-b857-4530-8823-66a7204feade-kube-api-access-lmcg2\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:02 crc kubenswrapper[4835]: I1003 18:32:02.082856 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c692-account-create-wrwnk" event={"ID":"45a00bc2-b857-4530-8823-66a7204feade","Type":"ContainerDied","Data":"93aaae8ce1d715130a546a2e46b30c796566e7bc5ea0090f8ef92987cb38b108"} Oct 03 18:32:02 crc kubenswrapper[4835]: I1003 18:32:02.083147 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93aaae8ce1d715130a546a2e46b30c796566e7bc5ea0090f8ef92987cb38b108" Oct 03 18:32:02 crc kubenswrapper[4835]: I1003 18:32:02.082900 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c692-account-create-wrwnk" Oct 03 18:32:02 crc kubenswrapper[4835]: I1003 18:32:02.084700 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d472-account-create-4n9kl" event={"ID":"e42d7b31-acc0-4ed1-87f5-3a2fad499556","Type":"ContainerDied","Data":"b9354bee0abbf8059dc3619148fb7ef29e53bfc6791f452d18d2748f3bee5b28"} Oct 03 18:32:02 crc kubenswrapper[4835]: I1003 18:32:02.084750 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9354bee0abbf8059dc3619148fb7ef29e53bfc6791f452d18d2748f3bee5b28" Oct 03 18:32:02 crc kubenswrapper[4835]: I1003 18:32:02.084753 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d472-account-create-4n9kl" Oct 03 18:32:04 crc kubenswrapper[4835]: I1003 18:32:04.098805 4835 generic.go:334] "Generic (PLEG): container finished" podID="2f585da2-5074-43be-8260-e854b0e3b1a6" containerID="8cebce2b888e0113dc21d943570529659909ea96b53b980f985eb170ed64f72a" exitCode=0 Oct 03 18:32:04 crc kubenswrapper[4835]: I1003 18:32:04.098877 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-9bqlb" event={"ID":"2f585da2-5074-43be-8260-e854b0e3b1a6","Type":"ContainerDied","Data":"8cebce2b888e0113dc21d943570529659909ea96b53b980f985eb170ed64f72a"} Oct 03 18:32:05 crc kubenswrapper[4835]: I1003 18:32:05.129471 4835 generic.go:334] "Generic (PLEG): container finished" podID="07b0cbfd-5980-4408-927d-6e8b474a09a7" containerID="42490ccf585ca2dfb4dabd7c5c3d9cd9e8a3b01cface0dd3b9c29112b1c707c1" exitCode=0 Oct 03 18:32:05 crc kubenswrapper[4835]: I1003 18:32:05.129623 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-g9qx7" event={"ID":"07b0cbfd-5980-4408-927d-6e8b474a09a7","Type":"ContainerDied","Data":"42490ccf585ca2dfb4dabd7c5c3d9cd9e8a3b01cface0dd3b9c29112b1c707c1"} Oct 03 18:32:05 crc kubenswrapper[4835]: I1003 18:32:05.449705 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-9bqlb" Oct 03 18:32:05 crc kubenswrapper[4835]: I1003 18:32:05.601903 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-db-sync-config-data\") pod \"2f585da2-5074-43be-8260-e854b0e3b1a6\" (UID: \"2f585da2-5074-43be-8260-e854b0e3b1a6\") " Oct 03 18:32:05 crc kubenswrapper[4835]: I1003 18:32:05.601978 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-config-data\") pod \"2f585da2-5074-43be-8260-e854b0e3b1a6\" (UID: \"2f585da2-5074-43be-8260-e854b0e3b1a6\") " Oct 03 18:32:05 crc kubenswrapper[4835]: I1003 18:32:05.602164 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-combined-ca-bundle\") pod \"2f585da2-5074-43be-8260-e854b0e3b1a6\" (UID: \"2f585da2-5074-43be-8260-e854b0e3b1a6\") " Oct 03 18:32:05 crc kubenswrapper[4835]: I1003 18:32:05.602386 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27fx8\" (UniqueName: \"kubernetes.io/projected/2f585da2-5074-43be-8260-e854b0e3b1a6-kube-api-access-27fx8\") pod \"2f585da2-5074-43be-8260-e854b0e3b1a6\" (UID: \"2f585da2-5074-43be-8260-e854b0e3b1a6\") " Oct 03 18:32:05 crc kubenswrapper[4835]: I1003 18:32:05.607440 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f585da2-5074-43be-8260-e854b0e3b1a6-kube-api-access-27fx8" (OuterVolumeSpecName: "kube-api-access-27fx8") pod "2f585da2-5074-43be-8260-e854b0e3b1a6" (UID: "2f585da2-5074-43be-8260-e854b0e3b1a6"). InnerVolumeSpecName "kube-api-access-27fx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:05 crc kubenswrapper[4835]: I1003 18:32:05.613568 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2f585da2-5074-43be-8260-e854b0e3b1a6" (UID: "2f585da2-5074-43be-8260-e854b0e3b1a6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:05 crc kubenswrapper[4835]: I1003 18:32:05.627680 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f585da2-5074-43be-8260-e854b0e3b1a6" (UID: "2f585da2-5074-43be-8260-e854b0e3b1a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:05 crc kubenswrapper[4835]: I1003 18:32:05.647147 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-config-data" (OuterVolumeSpecName: "config-data") pod "2f585da2-5074-43be-8260-e854b0e3b1a6" (UID: "2f585da2-5074-43be-8260-e854b0e3b1a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:05 crc kubenswrapper[4835]: I1003 18:32:05.704532 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:05 crc kubenswrapper[4835]: I1003 18:32:05.704570 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27fx8\" (UniqueName: \"kubernetes.io/projected/2f585da2-5074-43be-8260-e854b0e3b1a6-kube-api-access-27fx8\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:05 crc kubenswrapper[4835]: I1003 18:32:05.704586 4835 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:05 crc kubenswrapper[4835]: I1003 18:32:05.704598 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f585da2-5074-43be-8260-e854b0e3b1a6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:06 crc kubenswrapper[4835]: I1003 18:32:06.140913 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-9bqlb" event={"ID":"2f585da2-5074-43be-8260-e854b0e3b1a6","Type":"ContainerDied","Data":"15541d07176e536500a59b73c015f0fe73a9bae24ec78773251fd0b7aff13959"} Oct 03 18:32:06 crc kubenswrapper[4835]: I1003 18:32:06.141268 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15541d07176e536500a59b73c015f0fe73a9bae24ec78773251fd0b7aff13959" Oct 03 18:32:06 crc kubenswrapper[4835]: I1003 18:32:06.140940 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-9bqlb" Oct 03 18:32:06 crc kubenswrapper[4835]: I1003 18:32:06.474917 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-g9qx7" Oct 03 18:32:06 crc kubenswrapper[4835]: I1003 18:32:06.616105 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07b0cbfd-5980-4408-927d-6e8b474a09a7-combined-ca-bundle\") pod \"07b0cbfd-5980-4408-927d-6e8b474a09a7\" (UID: \"07b0cbfd-5980-4408-927d-6e8b474a09a7\") " Oct 03 18:32:06 crc kubenswrapper[4835]: I1003 18:32:06.616245 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07b0cbfd-5980-4408-927d-6e8b474a09a7-config-data\") pod \"07b0cbfd-5980-4408-927d-6e8b474a09a7\" (UID: \"07b0cbfd-5980-4408-927d-6e8b474a09a7\") " Oct 03 18:32:06 crc kubenswrapper[4835]: I1003 18:32:06.616305 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh5vq\" (UniqueName: \"kubernetes.io/projected/07b0cbfd-5980-4408-927d-6e8b474a09a7-kube-api-access-kh5vq\") pod \"07b0cbfd-5980-4408-927d-6e8b474a09a7\" (UID: \"07b0cbfd-5980-4408-927d-6e8b474a09a7\") " Oct 03 18:32:06 crc kubenswrapper[4835]: I1003 18:32:06.625772 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07b0cbfd-5980-4408-927d-6e8b474a09a7-kube-api-access-kh5vq" (OuterVolumeSpecName: "kube-api-access-kh5vq") pod "07b0cbfd-5980-4408-927d-6e8b474a09a7" (UID: "07b0cbfd-5980-4408-927d-6e8b474a09a7"). InnerVolumeSpecName "kube-api-access-kh5vq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:06 crc kubenswrapper[4835]: I1003 18:32:06.647945 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07b0cbfd-5980-4408-927d-6e8b474a09a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07b0cbfd-5980-4408-927d-6e8b474a09a7" (UID: "07b0cbfd-5980-4408-927d-6e8b474a09a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:06 crc kubenswrapper[4835]: I1003 18:32:06.676695 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07b0cbfd-5980-4408-927d-6e8b474a09a7-config-data" (OuterVolumeSpecName: "config-data") pod "07b0cbfd-5980-4408-927d-6e8b474a09a7" (UID: "07b0cbfd-5980-4408-927d-6e8b474a09a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:06 crc kubenswrapper[4835]: I1003 18:32:06.720142 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07b0cbfd-5980-4408-927d-6e8b474a09a7-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:06 crc kubenswrapper[4835]: I1003 18:32:06.720177 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh5vq\" (UniqueName: \"kubernetes.io/projected/07b0cbfd-5980-4408-927d-6e8b474a09a7-kube-api-access-kh5vq\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:06 crc kubenswrapper[4835]: I1003 18:32:06.720191 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07b0cbfd-5980-4408-927d-6e8b474a09a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.149786 4835 generic.go:334] "Generic (PLEG): container finished" podID="94622de9-5048-41be-875b-dc37acc7eba4" containerID="faf723616274a65eabc0b792453f25742b5b77ee440a051863d30601fd1f9525" exitCode=0 Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.149915 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9jg6f" event={"ID":"94622de9-5048-41be-875b-dc37acc7eba4","Type":"ContainerDied","Data":"faf723616274a65eabc0b792453f25742b5b77ee440a051863d30601fd1f9525"} Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.152540 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-g9qx7" event={"ID":"07b0cbfd-5980-4408-927d-6e8b474a09a7","Type":"ContainerDied","Data":"fcc58601b86afad184b5a47faea83f2cc458b732e913e2ef38ee284023736cf4"} Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.152582 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcc58601b86afad184b5a47faea83f2cc458b732e913e2ef38ee284023736cf4" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.152604 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-g9qx7" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.244592 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c3bd-account-create-cvkm5"] Oct 03 18:32:07 crc kubenswrapper[4835]: E1003 18:32:07.244951 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f585da2-5074-43be-8260-e854b0e3b1a6" containerName="watcher-db-sync" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.244967 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f585da2-5074-43be-8260-e854b0e3b1a6" containerName="watcher-db-sync" Oct 03 18:32:07 crc kubenswrapper[4835]: E1003 18:32:07.244992 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a00bc2-b857-4530-8823-66a7204feade" containerName="mariadb-account-create" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.244999 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a00bc2-b857-4530-8823-66a7204feade" containerName="mariadb-account-create" Oct 03 18:32:07 crc kubenswrapper[4835]: E1003 18:32:07.245013 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b0cbfd-5980-4408-927d-6e8b474a09a7" containerName="keystone-db-sync" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.245019 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b0cbfd-5980-4408-927d-6e8b474a09a7" containerName="keystone-db-sync" Oct 03 18:32:07 crc kubenswrapper[4835]: E1003 18:32:07.245034 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42d7b31-acc0-4ed1-87f5-3a2fad499556" containerName="mariadb-account-create" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.245041 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42d7b31-acc0-4ed1-87f5-3a2fad499556" containerName="mariadb-account-create" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.245228 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f585da2-5074-43be-8260-e854b0e3b1a6" containerName="watcher-db-sync" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.245246 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="07b0cbfd-5980-4408-927d-6e8b474a09a7" containerName="keystone-db-sync" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.245265 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42d7b31-acc0-4ed1-87f5-3a2fad499556" containerName="mariadb-account-create" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.245275 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="45a00bc2-b857-4530-8823-66a7204feade" containerName="mariadb-account-create" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.245842 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c3bd-account-create-cvkm5" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.248252 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.255548 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c3bd-account-create-cvkm5"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.330780 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8qkm\" (UniqueName: \"kubernetes.io/projected/87bc8f1a-de34-4448-bc20-10e5b92907e6-kube-api-access-j8qkm\") pod \"neutron-c3bd-account-create-cvkm5\" (UID: \"87bc8f1a-de34-4448-bc20-10e5b92907e6\") " pod="openstack/neutron-c3bd-account-create-cvkm5" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.345164 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c4c45f94f-2g4gf"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.346506 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.374634 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c4c45f94f-2g4gf"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.432275 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plrmv\" (UniqueName: \"kubernetes.io/projected/cf560b83-49ed-48b6-b61f-d59166907390-kube-api-access-plrmv\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.432363 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-ovsdbserver-sb\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.432397 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-dns-swift-storage-0\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.432444 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-ovsdbserver-nb\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.432469 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-config\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.432493 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-j8qkm\" (UniqueName: \"kubernetes.io/projected/87bc8f1a-de34-4448-bc20-10e5b92907e6-kube-api-access-j8qkm\") pod \"neutron-c3bd-account-create-cvkm5\" (UID: \"87bc8f1a-de34-4448-bc20-10e5b92907e6\") " pod="openstack/neutron-c3bd-account-create-cvkm5" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.432524 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-dns-svc\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.439001 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-f9k44"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.440189 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.451596 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.452099 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.452175 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vd8gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.452310 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.460932 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8qkm\" (UniqueName: \"kubernetes.io/projected/87bc8f1a-de34-4448-bc20-10e5b92907e6-kube-api-access-j8qkm\") pod \"neutron-c3bd-account-create-cvkm5\" (UID: \"87bc8f1a-de34-4448-bc20-10e5b92907e6\") " pod="openstack/neutron-c3bd-account-create-cvkm5" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.498044 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-f9k44"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.523064 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.524490 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.536494 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-plmhx" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.536533 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.537450 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-dns-svc\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.537514 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-fernet-keys\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.537535 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26qgm\" (UniqueName: \"kubernetes.io/projected/2cb51f11-d7f5-46ed-825f-6ca8c530094b-kube-api-access-26qgm\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.537556 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-credential-keys\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.537604 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plrmv\" (UniqueName: \"kubernetes.io/projected/cf560b83-49ed-48b6-b61f-d59166907390-kube-api-access-plrmv\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.537620 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-scripts\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.537636 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-config-data\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.537653 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-combined-ca-bundle\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " 
pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.537694 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-ovsdbserver-sb\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.537712 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-dns-swift-storage-0\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.537750 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-ovsdbserver-nb\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.537773 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-config\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.538635 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-config\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.539222 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-dns-svc\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.540012 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-ovsdbserver-sb\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.540521 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-dns-swift-storage-0\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.541028 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-ovsdbserver-nb\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.541223 4835 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.542612 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.548418 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.562133 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.564210 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c3bd-account-create-cvkm5" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.575112 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.614284 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.616310 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.636168 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plrmv\" (UniqueName: \"kubernetes.io/projected/cf560b83-49ed-48b6-b61f-d59166907390-kube-api-access-plrmv\") pod \"dnsmasq-dns-7c4c45f94f-2g4gf\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.665105 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.665847 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-scripts\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.666049 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-config-data\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.670267 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.674241 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.682368 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-combined-ca-bundle\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.683820 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4870398d-de86-4dc6-9052-b6e80bfe27f5-logs\") pod \"watcher-applier-0\" (UID: \"4870398d-de86-4dc6-9052-b6e80bfe27f5\") " pod="openstack/watcher-applier-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.683863 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4870398d-de86-4dc6-9052-b6e80bfe27f5-config-data\") pod \"watcher-applier-0\" (UID: \"4870398d-de86-4dc6-9052-b6e80bfe27f5\") " pod="openstack/watcher-applier-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.702128 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.715320 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-scripts\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.683968 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.723694 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8110d0e5-9e19-4306-b8aa-babe937e8d2a-logs\") pod \"watcher-decision-engine-0\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.723774 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4870398d-de86-4dc6-9052-b6e80bfe27f5-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"4870398d-de86-4dc6-9052-b6e80bfe27f5\") " pod="openstack/watcher-applier-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.723809 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjdxn\" (UniqueName: \"kubernetes.io/projected/8110d0e5-9e19-4306-b8aa-babe937e8d2a-kube-api-access-hjdxn\") pod \"watcher-decision-engine-0\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.723851 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgn45\" (UniqueName: 
\"kubernetes.io/projected/4870398d-de86-4dc6-9052-b6e80bfe27f5-kube-api-access-pgn45\") pod \"watcher-applier-0\" (UID: \"4870398d-de86-4dc6-9052-b6e80bfe27f5\") " pod="openstack/watcher-applier-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.723917 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.723962 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-fernet-keys\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.723993 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26qgm\" (UniqueName: \"kubernetes.io/projected/2cb51f11-d7f5-46ed-825f-6ca8c530094b-kube-api-access-26qgm\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.724013 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-credential-keys\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.730857 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-fernet-keys\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.731278 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-config-data\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.740833 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-credential-keys\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.747718 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-combined-ca-bundle\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.779159 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-dpk2w"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.780418 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.793159 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x2whm" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.793468 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.793640 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26qgm\" (UniqueName: \"kubernetes.io/projected/2cb51f11-d7f5-46ed-825f-6ca8c530094b-kube-api-access-26qgm\") pod \"keystone-bootstrap-f9k44\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.794449 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.818363 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.825827 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.825879 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5z7m\" (UniqueName: \"kubernetes.io/projected/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-kube-api-access-b5z7m\") pod \"watcher-api-0\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " pod="openstack/watcher-api-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.825909 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4870398d-de86-4dc6-9052-b6e80bfe27f5-logs\") pod \"watcher-applier-0\" (UID: \"4870398d-de86-4dc6-9052-b6e80bfe27f5\") " pod="openstack/watcher-applier-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.825934 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-config-data\") pod \"watcher-api-0\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " pod="openstack/watcher-api-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.825961 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4870398d-de86-4dc6-9052-b6e80bfe27f5-config-data\") pod \"watcher-applier-0\" (UID: \"4870398d-de86-4dc6-9052-b6e80bfe27f5\") " pod="openstack/watcher-applier-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.825984 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.826042 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8110d0e5-9e19-4306-b8aa-babe937e8d2a-logs\") pod \"watcher-decision-engine-0\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.826092 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4870398d-de86-4dc6-9052-b6e80bfe27f5-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"4870398d-de86-4dc6-9052-b6e80bfe27f5\") " pod="openstack/watcher-applier-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.826112 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjdxn\" (UniqueName: \"kubernetes.io/projected/8110d0e5-9e19-4306-b8aa-babe937e8d2a-kube-api-access-hjdxn\") pod \"watcher-decision-engine-0\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.826131 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgn45\" (UniqueName: \"kubernetes.io/projected/4870398d-de86-4dc6-9052-b6e80bfe27f5-kube-api-access-pgn45\") pod \"watcher-applier-0\" (UID: \"4870398d-de86-4dc6-9052-b6e80bfe27f5\") " pod="openstack/watcher-applier-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.826150 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " pod="openstack/watcher-api-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.826183 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.826222 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " pod="openstack/watcher-api-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.826252 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-logs\") pod \"watcher-api-0\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " pod="openstack/watcher-api-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.828579 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4870398d-de86-4dc6-9052-b6e80bfe27f5-logs\") pod \"watcher-applier-0\" (UID: \"4870398d-de86-4dc6-9052-b6e80bfe27f5\") " pod="openstack/watcher-applier-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.830738 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " 
pod="openstack/watcher-decision-engine-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.831144 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4870398d-de86-4dc6-9052-b6e80bfe27f5-config-data\") pod \"watcher-applier-0\" (UID: \"4870398d-de86-4dc6-9052-b6e80bfe27f5\") " pod="openstack/watcher-applier-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.835788 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8110d0e5-9e19-4306-b8aa-babe937e8d2a-logs\") pod \"watcher-decision-engine-0\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.854544 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dpk2w"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.856153 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.856359 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.863058 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4870398d-de86-4dc6-9052-b6e80bfe27f5-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"4870398d-de86-4dc6-9052-b6e80bfe27f5\") " pod="openstack/watcher-applier-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.869165 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjdxn\" (UniqueName: \"kubernetes.io/projected/8110d0e5-9e19-4306-b8aa-babe937e8d2a-kube-api-access-hjdxn\") pod \"watcher-decision-engine-0\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.883973 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-59ddd97667-jf4l4"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.886678 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.889371 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.889528 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-c8ktk" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.890318 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgn45\" (UniqueName: \"kubernetes.io/projected/4870398d-de86-4dc6-9052-b6e80bfe27f5-kube-api-access-pgn45\") pod \"watcher-applier-0\" (UID: \"4870398d-de86-4dc6-9052-b6e80bfe27f5\") " pod="openstack/watcher-applier-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.893372 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.893607 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.904580 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.908544 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.918099 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.918307 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.929151 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.929725 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-db-sync-config-data\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.929767 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " pod="openstack/watcher-api-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.929799 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22cdq\" (UniqueName: \"kubernetes.io/projected/705966b1-0d0b-4c12-9cc1-830277fcf80c-kube-api-access-22cdq\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.929819 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-config-data\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.929846 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-logs\") pod \"watcher-api-0\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " pod="openstack/watcher-api-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.929867 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-scripts\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.929898 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5z7m\" (UniqueName: \"kubernetes.io/projected/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-kube-api-access-b5z7m\") pod \"watcher-api-0\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " pod="openstack/watcher-api-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.929920 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-config-data\") pod \"watcher-api-0\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " pod="openstack/watcher-api-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.929979 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-combined-ca-bundle\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.930019 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/705966b1-0d0b-4c12-9cc1-830277fcf80c-etc-machine-id\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.930042 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " pod="openstack/watcher-api-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.931014 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-logs\") pod \"watcher-api-0\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " pod="openstack/watcher-api-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.934644 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59ddd97667-jf4l4"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.943759 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " pod="openstack/watcher-api-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.943830 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-fz6rs"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 
18:32:07.944852 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fz6rs" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.946265 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " pod="openstack/watcher-api-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.951617 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6gnk2" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.951794 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.952243 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4c45f94f-2g4gf"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.970676 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5z7m\" (UniqueName: \"kubernetes.io/projected/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-kube-api-access-b5z7m\") pod \"watcher-api-0\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " pod="openstack/watcher-api-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.987706 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-config-data\") pod \"watcher-api-0\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " pod="openstack/watcher-api-0" Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.988115 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fz6rs"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.998134 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-xkgv4"] Oct 03 18:32:07 crc kubenswrapper[4835]: I1003 18:32:07.999357 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.001822 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.002594 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7wl7m" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.002685 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.016170 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xkgv4"] Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.026135 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c4cb4cdd5-hjjlp"] Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.027669 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031604 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/709d622b-7993-4d18-8185-10b4f1c81d79-db-sync-config-data\") pod \"barbican-db-sync-fz6rs\" (UID: \"709d622b-7993-4d18-8185-10b4f1c81d79\") " pod="openstack/barbican-db-sync-fz6rs" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031650 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-combined-ca-bundle\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031683 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af75f57a-7612-48c8-b3fb-8594e81e2d0a-run-httpd\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031706 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd72585e-bb84-41a5-bfde-d55a3978c294-config-data\") pod \"horizon-59ddd97667-jf4l4\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031723 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/705966b1-0d0b-4c12-9cc1-830277fcf80c-etc-machine-id\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031750 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af75f57a-7612-48c8-b3fb-8594e81e2d0a-log-httpd\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031766 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd72585e-bb84-41a5-bfde-d55a3978c294-logs\") pod \"horizon-59ddd97667-jf4l4\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031784 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709d622b-7993-4d18-8185-10b4f1c81d79-combined-ca-bundle\") pod \"barbican-db-sync-fz6rs\" (UID: \"709d622b-7993-4d18-8185-10b4f1c81d79\") " pod="openstack/barbican-db-sync-fz6rs" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031806 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: 
I1003 18:32:08.031824 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-db-sync-config-data\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031842 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb4l2\" (UniqueName: \"kubernetes.io/projected/709d622b-7993-4d18-8185-10b4f1c81d79-kube-api-access-qb4l2\") pod \"barbican-db-sync-fz6rs\" (UID: \"709d622b-7993-4d18-8185-10b4f1c81d79\") " pod="openstack/barbican-db-sync-fz6rs" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031857 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031880 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22cdq\" (UniqueName: \"kubernetes.io/projected/705966b1-0d0b-4c12-9cc1-830277fcf80c-kube-api-access-22cdq\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031898 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-config-data\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031916 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd72585e-bb84-41a5-bfde-d55a3978c294-scripts\") pod \"horizon-59ddd97667-jf4l4\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031935 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fd72585e-bb84-41a5-bfde-d55a3978c294-horizon-secret-key\") pod \"horizon-59ddd97667-jf4l4\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031952 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-scripts\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031976 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-scripts\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.031988 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-config-data\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.032029 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfv8r\" (UniqueName: \"kubernetes.io/projected/af75f57a-7612-48c8-b3fb-8594e81e2d0a-kube-api-access-xfv8r\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.032053 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsjq6\" (UniqueName: \"kubernetes.io/projected/fd72585e-bb84-41a5-bfde-d55a3978c294-kube-api-access-wsjq6\") pod \"horizon-59ddd97667-jf4l4\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.033051 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/705966b1-0d0b-4c12-9cc1-830277fcf80c-etc-machine-id\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.039571 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-db-sync-config-data\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.041722 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-combined-ca-bundle\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.041789 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c4cb4cdd5-hjjlp"] Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.042118 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-config-data\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.044218 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-scripts\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.058805 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22cdq\" (UniqueName: \"kubernetes.io/projected/705966b1-0d0b-4c12-9cc1-830277fcf80c-kube-api-access-22cdq\") pod \"cinder-db-sync-dpk2w\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.075162 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-57778c7597-llbt4"] Oct 03 18:32:08 crc 
kubenswrapper[4835]: I1003 18:32:08.076728 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.105093 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57778c7597-llbt4"] Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.123291 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133266 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-dns-swift-storage-0\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133316 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-config-data\") pod \"placement-db-sync-xkgv4\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133346 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb4xd\" (UniqueName: \"kubernetes.io/projected/6689e7e5-6421-4939-b68b-d93c54479b72-kube-api-access-rb4xd\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133386 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af75f57a-7612-48c8-b3fb-8594e81e2d0a-log-httpd\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133416 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd72585e-bb84-41a5-bfde-d55a3978c294-logs\") pod \"horizon-59ddd97667-jf4l4\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133441 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709d622b-7993-4d18-8185-10b4f1c81d79-combined-ca-bundle\") pod \"barbican-db-sync-fz6rs\" (UID: \"709d622b-7993-4d18-8185-10b4f1c81d79\") " pod="openstack/barbican-db-sync-fz6rs" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133495 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133547 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-combined-ca-bundle\") pod \"placement-db-sync-xkgv4\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " 
pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133575 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb4l2\" (UniqueName: \"kubernetes.io/projected/709d622b-7993-4d18-8185-10b4f1c81d79-kube-api-access-qb4l2\") pod \"barbican-db-sync-fz6rs\" (UID: \"709d622b-7993-4d18-8185-10b4f1c81d79\") " pod="openstack/barbican-db-sync-fz6rs" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133595 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133627 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd72585e-bb84-41a5-bfde-d55a3978c294-scripts\") pod \"horizon-59ddd97667-jf4l4\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133651 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fd72585e-bb84-41a5-bfde-d55a3978c294-horizon-secret-key\") pod \"horizon-59ddd97667-jf4l4\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133680 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-scripts\") pod \"placement-db-sync-xkgv4\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133713 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-scripts\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133733 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-config-data\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133783 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-ovsdbserver-nb\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133821 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfv8r\" (UniqueName: \"kubernetes.io/projected/af75f57a-7612-48c8-b3fb-8594e81e2d0a-kube-api-access-xfv8r\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133849 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-dns-svc\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133871 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb5zf\" (UniqueName: \"kubernetes.io/projected/ddee52bd-4539-46b6-a51f-50fe9278665a-kube-api-access-bb5zf\") pod \"placement-db-sync-xkgv4\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133896 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsjq6\" (UniqueName: \"kubernetes.io/projected/fd72585e-bb84-41a5-bfde-d55a3978c294-kube-api-access-wsjq6\") pod \"horizon-59ddd97667-jf4l4\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133924 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-ovsdbserver-sb\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133948 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/709d622b-7993-4d18-8185-10b4f1c81d79-db-sync-config-data\") pod \"barbican-db-sync-fz6rs\" (UID: \"709d622b-7993-4d18-8185-10b4f1c81d79\") " pod="openstack/barbican-db-sync-fz6rs" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.133983 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-config\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.134002 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddee52bd-4539-46b6-a51f-50fe9278665a-logs\") pod \"placement-db-sync-xkgv4\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.134035 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af75f57a-7612-48c8-b3fb-8594e81e2d0a-run-httpd\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.134059 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd72585e-bb84-41a5-bfde-d55a3978c294-config-data\") pod \"horizon-59ddd97667-jf4l4\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.136408 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/fd72585e-bb84-41a5-bfde-d55a3978c294-config-data\") pod \"horizon-59ddd97667-jf4l4\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.136776 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af75f57a-7612-48c8-b3fb-8594e81e2d0a-run-httpd\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.136796 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709d622b-7993-4d18-8185-10b4f1c81d79-combined-ca-bundle\") pod \"barbican-db-sync-fz6rs\" (UID: \"709d622b-7993-4d18-8185-10b4f1c81d79\") " pod="openstack/barbican-db-sync-fz6rs" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.136956 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af75f57a-7612-48c8-b3fb-8594e81e2d0a-log-httpd\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.137235 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd72585e-bb84-41a5-bfde-d55a3978c294-logs\") pod \"horizon-59ddd97667-jf4l4\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.137326 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd72585e-bb84-41a5-bfde-d55a3978c294-scripts\") pod \"horizon-59ddd97667-jf4l4\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.141105 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-scripts\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.141632 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.142487 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fd72585e-bb84-41a5-bfde-d55a3978c294-horizon-secret-key\") pod \"horizon-59ddd97667-jf4l4\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.143538 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/709d622b-7993-4d18-8185-10b4f1c81d79-db-sync-config-data\") pod \"barbican-db-sync-fz6rs\" (UID: \"709d622b-7993-4d18-8185-10b4f1c81d79\") " pod="openstack/barbican-db-sync-fz6rs" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.143890 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-config-data\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.153736 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.160155 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsjq6\" (UniqueName: \"kubernetes.io/projected/fd72585e-bb84-41a5-bfde-d55a3978c294-kube-api-access-wsjq6\") pod \"horizon-59ddd97667-jf4l4\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.157061 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.159322 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfv8r\" (UniqueName: \"kubernetes.io/projected/af75f57a-7612-48c8-b3fb-8594e81e2d0a-kube-api-access-xfv8r\") pod \"ceilometer-0\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.159377 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb4l2\" (UniqueName: \"kubernetes.io/projected/709d622b-7993-4d18-8185-10b4f1c81d79-kube-api-access-qb4l2\") pod \"barbican-db-sync-fz6rs\" (UID: \"709d622b-7993-4d18-8185-10b4f1c81d79\") " pod="openstack/barbican-db-sync-fz6rs" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.155778 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.165199 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.208581 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.238135 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-ovsdbserver-nb\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.238224 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-dns-svc\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.238254 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb5zf\" (UniqueName: \"kubernetes.io/projected/ddee52bd-4539-46b6-a51f-50fe9278665a-kube-api-access-bb5zf\") pod \"placement-db-sync-xkgv4\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.238292 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-ovsdbserver-sb\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.238333 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-config\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.238361 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddee52bd-4539-46b6-a51f-50fe9278665a-logs\") pod \"placement-db-sync-xkgv4\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.238396 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-scripts\") pod \"horizon-57778c7597-llbt4\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.238433 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8djxn\" (UniqueName: \"kubernetes.io/projected/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-kube-api-access-8djxn\") pod \"horizon-57778c7597-llbt4\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.238462 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-horizon-secret-key\") pod \"horizon-57778c7597-llbt4\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " pod="openstack/horizon-57778c7597-llbt4" Oct 03 
18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.238488 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-config-data\") pod \"horizon-57778c7597-llbt4\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.238517 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-dns-swift-storage-0\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.238546 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-config-data\") pod \"placement-db-sync-xkgv4\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.238569 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb4xd\" (UniqueName: \"kubernetes.io/projected/6689e7e5-6421-4939-b68b-d93c54479b72-kube-api-access-rb4xd\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.238623 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-combined-ca-bundle\") pod \"placement-db-sync-xkgv4\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.238653 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-logs\") pod \"horizon-57778c7597-llbt4\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.238701 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-scripts\") pod \"placement-db-sync-xkgv4\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.239952 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-ovsdbserver-nb\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.247132 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddee52bd-4539-46b6-a51f-50fe9278665a-logs\") pod \"placement-db-sync-xkgv4\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.247784 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-ovsdbserver-sb\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.249050 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-dns-swift-storage-0\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.249716 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-dns-svc\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.249762 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-config-data\") pod \"placement-db-sync-xkgv4\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.250708 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-config\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.255187 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-scripts\") pod \"placement-db-sync-xkgv4\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.258255 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-combined-ca-bundle\") pod \"placement-db-sync-xkgv4\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.267766 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb5zf\" (UniqueName: \"kubernetes.io/projected/ddee52bd-4539-46b6-a51f-50fe9278665a-kube-api-access-bb5zf\") pod \"placement-db-sync-xkgv4\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.270019 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb4xd\" (UniqueName: \"kubernetes.io/projected/6689e7e5-6421-4939-b68b-d93c54479b72-kube-api-access-rb4xd\") pod \"dnsmasq-dns-c4cb4cdd5-hjjlp\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.271733 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.315126 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-fz6rs" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.322123 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.344718 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-logs\") pod \"horizon-57778c7597-llbt4\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.344843 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-scripts\") pod \"horizon-57778c7597-llbt4\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.344875 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8djxn\" (UniqueName: \"kubernetes.io/projected/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-kube-api-access-8djxn\") pod \"horizon-57778c7597-llbt4\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.344897 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-horizon-secret-key\") pod \"horizon-57778c7597-llbt4\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.344914 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-config-data\") pod \"horizon-57778c7597-llbt4\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.346702 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-scripts\") pod \"horizon-57778c7597-llbt4\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.346957 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-logs\") pod \"horizon-57778c7597-llbt4\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.348242 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-config-data\") pod \"horizon-57778c7597-llbt4\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.351422 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.355598 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-horizon-secret-key\") pod \"horizon-57778c7597-llbt4\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.362301 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8djxn\" (UniqueName: \"kubernetes.io/projected/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-kube-api-access-8djxn\") pod \"horizon-57778c7597-llbt4\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.366100 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4c45f94f-2g4gf"] Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.378990 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c3bd-account-create-cvkm5"] Oct 03 18:32:08 crc kubenswrapper[4835]: W1003 18:32:08.402054 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf560b83_49ed_48b6_b61f_d59166907390.slice/crio-11238667055d249574d07a3b1dc56eaea70697ad83a560846863a8b1e9afc40c WatchSource:0}: Error finding container 11238667055d249574d07a3b1dc56eaea70697ad83a560846863a8b1e9afc40c: Status 404 returned error can't find the container with id 11238667055d249574d07a3b1dc56eaea70697ad83a560846863a8b1e9afc40c Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.411456 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.594910 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-f9k44"] Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.684843 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 03 18:32:08 crc kubenswrapper[4835]: W1003 18:32:08.720223 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cb51f11_d7f5_46ed_825f_6ca8c530094b.slice/crio-e9fda921116b2d41f9a2532bcbe6cc6fe8ed9b7659457e530a2cfe02590efd0e WatchSource:0}: Error finding container e9fda921116b2d41f9a2532bcbe6cc6fe8ed9b7659457e530a2cfe02590efd0e: Status 404 returned error can't find the container with id e9fda921116b2d41f9a2532bcbe6cc6fe8ed9b7659457e530a2cfe02590efd0e Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.816310 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9jg6f" Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.975928 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtrrk\" (UniqueName: \"kubernetes.io/projected/94622de9-5048-41be-875b-dc37acc7eba4-kube-api-access-mtrrk\") pod \"94622de9-5048-41be-875b-dc37acc7eba4\" (UID: \"94622de9-5048-41be-875b-dc37acc7eba4\") " Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.976129 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-config-data\") pod \"94622de9-5048-41be-875b-dc37acc7eba4\" (UID: \"94622de9-5048-41be-875b-dc37acc7eba4\") " Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.976194 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-db-sync-config-data\") pod \"94622de9-5048-41be-875b-dc37acc7eba4\" (UID: \"94622de9-5048-41be-875b-dc37acc7eba4\") " Oct 03 18:32:08 crc kubenswrapper[4835]: I1003 18:32:08.976249 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-combined-ca-bundle\") pod \"94622de9-5048-41be-875b-dc37acc7eba4\" (UID: \"94622de9-5048-41be-875b-dc37acc7eba4\") " Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.007490 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "94622de9-5048-41be-875b-dc37acc7eba4" (UID: "94622de9-5048-41be-875b-dc37acc7eba4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.007837 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94622de9-5048-41be-875b-dc37acc7eba4-kube-api-access-mtrrk" (OuterVolumeSpecName: "kube-api-access-mtrrk") pod "94622de9-5048-41be-875b-dc37acc7eba4" (UID: "94622de9-5048-41be-875b-dc37acc7eba4"). InnerVolumeSpecName "kube-api-access-mtrrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.088800 4835 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.088840 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtrrk\" (UniqueName: \"kubernetes.io/projected/94622de9-5048-41be-875b-dc37acc7eba4-kube-api-access-mtrrk\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.207554 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94622de9-5048-41be-875b-dc37acc7eba4" (UID: "94622de9-5048-41be-875b-dc37acc7eba4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.210833 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" event={"ID":"cf560b83-49ed-48b6-b61f-d59166907390","Type":"ContainerStarted","Data":"2f0830d1e26fbae5c67dd32c8c6bc93a219e35e8042df43e05f20af0d8134982"} Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.212126 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" podUID="cf560b83-49ed-48b6-b61f-d59166907390" containerName="init" containerID="cri-o://2f0830d1e26fbae5c67dd32c8c6bc93a219e35e8042df43e05f20af0d8134982" gracePeriod=10 Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.213870 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" event={"ID":"cf560b83-49ed-48b6-b61f-d59166907390","Type":"ContainerStarted","Data":"11238667055d249574d07a3b1dc56eaea70697ad83a560846863a8b1e9afc40c"} Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.217224 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"1b461fe4-edb9-423d-b17e-7cb251f7fc0d","Type":"ContainerStarted","Data":"30a15d82c21e1ce46eb6850426cfb175e63efa5c262a63942dcd598e85437e24"} Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.218486 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9jg6f" event={"ID":"94622de9-5048-41be-875b-dc37acc7eba4","Type":"ContainerDied","Data":"97fa768958ebc3c8836b5b5440f17ff91dd92ef1d5d93c94f91358f301a3a0be"} Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.218507 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97fa768958ebc3c8836b5b5440f17ff91dd92ef1d5d93c94f91358f301a3a0be" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.218557 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9jg6f" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.220943 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f9k44" event={"ID":"2cb51f11-d7f5-46ed-825f-6ca8c530094b","Type":"ContainerStarted","Data":"e9fda921116b2d41f9a2532bcbe6cc6fe8ed9b7659457e530a2cfe02590efd0e"} Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.222292 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c3bd-account-create-cvkm5" event={"ID":"87bc8f1a-de34-4448-bc20-10e5b92907e6","Type":"ContainerStarted","Data":"341b0758340c4d6c6c5b16d296fefbaf5bbab98888255cd0a922ee3e3471be20"} Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.222311 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c3bd-account-create-cvkm5" event={"ID":"87bc8f1a-de34-4448-bc20-10e5b92907e6","Type":"ContainerStarted","Data":"554172fc7ed0f185a4e38fef3d02523f085004c60a5f09b9170b8cbe8025de2d"} Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.299842 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.311512 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-c3bd-account-create-cvkm5" podStartSLOduration=2.311492527 podStartE2EDuration="2.311492527s" podCreationTimestamp="2025-10-03 18:32:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:09.310702017 +0000 UTC m=+1071.026642889" watchObservedRunningTime="2025-10-03 18:32:09.311492527 +0000 UTC m=+1071.027433399" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.327707 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-config-data" (OuterVolumeSpecName: "config-data") pod "94622de9-5048-41be-875b-dc37acc7eba4" (UID: "94622de9-5048-41be-875b-dc37acc7eba4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.401378 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94622de9-5048-41be-875b-dc37acc7eba4-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.475530 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59ddd97667-jf4l4"] Oct 03 18:32:09 crc kubenswrapper[4835]: W1003 18:32:09.491426 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd72585e_bb84_41a5_bfde_d55a3978c294.slice/crio-5da3b46e51366cf2d2d0bfc675fe83f681f598c4378c0b47c12778c7ffdb5665 WatchSource:0}: Error finding container 5da3b46e51366cf2d2d0bfc675fe83f681f598c4378c0b47c12778c7ffdb5665: Status 404 returned error can't find the container with id 5da3b46e51366cf2d2d0bfc675fe83f681f598c4378c0b47c12778c7ffdb5665 Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.533755 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c4cb4cdd5-hjjlp"] Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.622707 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dpk2w"] Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.644913 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.667316 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c4cb4cdd5-hjjlp"] Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.699367 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64bb4dc8df-rxs4r"] Oct 03 18:32:09 crc kubenswrapper[4835]: E1003 18:32:09.699728 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94622de9-5048-41be-875b-dc37acc7eba4" containerName="glance-db-sync" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.699779 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="94622de9-5048-41be-875b-dc37acc7eba4" containerName="glance-db-sync" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.700012 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="94622de9-5048-41be-875b-dc37acc7eba4" containerName="glance-db-sync" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.717160 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xkgv4"] Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.719292 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.729147 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64bb4dc8df-rxs4r"] Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.836813 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.837329 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-config\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.837357 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9jnr\" (UniqueName: \"kubernetes.io/projected/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-kube-api-access-x9jnr\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.837413 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-dns-svc\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.837473 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-ovsdbserver-nb\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.837515 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-dns-swift-storage-0\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.837539 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-ovsdbserver-sb\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.939202 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-ovsdbserver-nb\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.939287 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-dns-swift-storage-0\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: 
\"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.939314 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-ovsdbserver-sb\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.939342 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-config\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.939363 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9jnr\" (UniqueName: \"kubernetes.io/projected/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-kube-api-access-x9jnr\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.939416 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-dns-svc\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.940551 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-dns-svc\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.941333 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-ovsdbserver-nb\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.942029 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-ovsdbserver-sb\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.943179 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-config\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.943302 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-dns-swift-storage-0\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: 
I1003 18:32:09.964961 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fz6rs"] Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.973547 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9jnr\" (UniqueName: \"kubernetes.io/projected/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-kube-api-access-x9jnr\") pod \"dnsmasq-dns-64bb4dc8df-rxs4r\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:09 crc kubenswrapper[4835]: I1003 18:32:09.977585 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57778c7597-llbt4"] Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.011954 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.060219 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57778c7597-llbt4"] Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.075758 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.081796 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-85456c75b5-fc7vc"] Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.092993 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.124315 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.151427 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.249963 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85456c75b5-fc7vc"] Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.251828 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plrmv\" (UniqueName: \"kubernetes.io/projected/cf560b83-49ed-48b6-b61f-d59166907390-kube-api-access-plrmv\") pod \"cf560b83-49ed-48b6-b61f-d59166907390\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.251879 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-ovsdbserver-sb\") pod \"cf560b83-49ed-48b6-b61f-d59166907390\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.251971 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-dns-swift-storage-0\") pod \"cf560b83-49ed-48b6-b61f-d59166907390\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.251994 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-ovsdbserver-nb\") pod \"cf560b83-49ed-48b6-b61f-d59166907390\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.252021 4835 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-config\") pod \"cf560b83-49ed-48b6-b61f-d59166907390\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.252105 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-dns-svc\") pod \"cf560b83-49ed-48b6-b61f-d59166907390\" (UID: \"cf560b83-49ed-48b6-b61f-d59166907390\") " Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.252299 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-logs\") pod \"horizon-85456c75b5-fc7vc\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.252370 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-horizon-secret-key\") pod \"horizon-85456c75b5-fc7vc\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.252399 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmrfl\" (UniqueName: \"kubernetes.io/projected/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-kube-api-access-xmrfl\") pod \"horizon-85456c75b5-fc7vc\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.253153 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-scripts\") pod \"horizon-85456c75b5-fc7vc\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.262370 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf560b83-49ed-48b6-b61f-d59166907390-kube-api-access-plrmv" (OuterVolumeSpecName: "kube-api-access-plrmv") pod "cf560b83-49ed-48b6-b61f-d59166907390" (UID: "cf560b83-49ed-48b6-b61f-d59166907390"). InnerVolumeSpecName "kube-api-access-plrmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.270511 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-config-data\") pod \"horizon-85456c75b5-fc7vc\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.270726 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plrmv\" (UniqueName: \"kubernetes.io/projected/cf560b83-49ed-48b6-b61f-d59166907390-kube-api-access-plrmv\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.279580 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 18:32:10 crc kubenswrapper[4835]: E1003 18:32:10.280021 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf560b83-49ed-48b6-b61f-d59166907390" containerName="init" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.280046 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf560b83-49ed-48b6-b61f-d59166907390" containerName="init" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.280328 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf560b83-49ed-48b6-b61f-d59166907390" containerName="init" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.283876 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.287162 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.321762 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.322114 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.323411 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fz6rs" event={"ID":"709d622b-7993-4d18-8185-10b4f1c81d79","Type":"ContainerStarted","Data":"e2fd3dc327cf9dd3fa12882bcdd92f5012261969d1f0901eb8e388812eb47729"} Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.323986 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-z6gzn" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.327481 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf560b83-49ed-48b6-b61f-d59166907390" (UID: "cf560b83-49ed-48b6-b61f-d59166907390"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.329131 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cf560b83-49ed-48b6-b61f-d59166907390" (UID: "cf560b83-49ed-48b6-b61f-d59166907390"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.333331 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59ddd97667-jf4l4" event={"ID":"fd72585e-bb84-41a5-bfde-d55a3978c294","Type":"ContainerStarted","Data":"5da3b46e51366cf2d2d0bfc675fe83f681f598c4378c0b47c12778c7ffdb5665"} Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.334795 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf560b83-49ed-48b6-b61f-d59166907390" (UID: "cf560b83-49ed-48b6-b61f-d59166907390"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.341308 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf560b83-49ed-48b6-b61f-d59166907390" (UID: "cf560b83-49ed-48b6-b61f-d59166907390"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.362042 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-config" (OuterVolumeSpecName: "config") pod "cf560b83-49ed-48b6-b61f-d59166907390" (UID: "cf560b83-49ed-48b6-b61f-d59166907390"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.367826 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f9k44" event={"ID":"2cb51f11-d7f5-46ed-825f-6ca8c530094b","Type":"ContainerStarted","Data":"96b91c11670e6c312658d7ec0de4f3048c8396ee891f3797f4b740ebe3682a56"} Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.378012 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-scripts\") pod \"horizon-85456c75b5-fc7vc\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.378419 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp525\" (UniqueName: \"kubernetes.io/projected/cc75da7c-52a7-421a-b04f-e9269a316a2e-kube-api-access-hp525\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.378500 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-config-data\") pod \"horizon-85456c75b5-fc7vc\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.378565 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-logs\") pod \"horizon-85456c75b5-fc7vc\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.378643 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.378760 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-horizon-secret-key\") pod \"horizon-85456c75b5-fc7vc\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.380358 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc75da7c-52a7-421a-b04f-e9269a316a2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.380462 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmrfl\" (UniqueName: \"kubernetes.io/projected/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-kube-api-access-xmrfl\") pod \"horizon-85456c75b5-fc7vc\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.380536 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.380649 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.380731 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.380815 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc75da7c-52a7-421a-b04f-e9269a316a2e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.380960 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.381021 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.381089 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.379111 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-scripts\") pod \"horizon-85456c75b5-fc7vc\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.379779 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-logs\") pod \"horizon-85456c75b5-fc7vc\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.381158 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.381232 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf560b83-49ed-48b6-b61f-d59166907390-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.379969 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-config-data\") pod \"horizon-85456c75b5-fc7vc\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.391689 4835 generic.go:334] "Generic (PLEG): container finished" podID="87bc8f1a-de34-4448-bc20-10e5b92907e6" containerID="341b0758340c4d6c6c5b16d296fefbaf5bbab98888255cd0a922ee3e3471be20" exitCode=0 Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.391822 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c3bd-account-create-cvkm5" event={"ID":"87bc8f1a-de34-4448-bc20-10e5b92907e6","Type":"ContainerDied","Data":"341b0758340c4d6c6c5b16d296fefbaf5bbab98888255cd0a922ee3e3471be20"} Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.393828 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-horizon-secret-key\") pod \"horizon-85456c75b5-fc7vc\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.404147 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.406460 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmrfl\" (UniqueName: \"kubernetes.io/projected/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-kube-api-access-xmrfl\") pod \"horizon-85456c75b5-fc7vc\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.412050 4835 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.414276 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" event={"ID":"6689e7e5-6421-4939-b68b-d93c54479b72","Type":"ContainerStarted","Data":"73f4db1f0a25cf25f27f4f580f4c4110d17577109e19a90002c2511c90bc1b9a"} Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.414300 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" podUID="6689e7e5-6421-4939-b68b-d93c54479b72" containerName="init" containerID="cri-o://91efb08f470610fa331d51decc2970eb96f6832cd62d2cd671c0b48192e0deaa" gracePeriod=10 Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.415257 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.417155 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.435705 4835 generic.go:334] "Generic (PLEG): container finished" podID="cf560b83-49ed-48b6-b61f-d59166907390" containerID="2f0830d1e26fbae5c67dd32c8c6bc93a219e35e8042df43e05f20af0d8134982" exitCode=0 Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.435796 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" event={"ID":"cf560b83-49ed-48b6-b61f-d59166907390","Type":"ContainerDied","Data":"2f0830d1e26fbae5c67dd32c8c6bc93a219e35e8042df43e05f20af0d8134982"} Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.435821 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" event={"ID":"cf560b83-49ed-48b6-b61f-d59166907390","Type":"ContainerDied","Data":"11238667055d249574d07a3b1dc56eaea70697ad83a560846863a8b1e9afc40c"} Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.435838 4835 scope.go:117] "RemoveContainer" containerID="2f0830d1e26fbae5c67dd32c8c6bc93a219e35e8042df43e05f20af0d8134982" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.436053 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4c45f94f-2g4gf" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.442152 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.443052 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-f9k44" podStartSLOduration=3.443034873 podStartE2EDuration="3.443034873s" podCreationTimestamp="2025-10-03 18:32:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:10.401147643 +0000 UTC m=+1072.117088525" watchObservedRunningTime="2025-10-03 18:32:10.443034873 +0000 UTC m=+1072.158975745" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.443319 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.463441 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="1b461fe4-edb9-423d-b17e-7cb251f7fc0d" containerName="watcher-api-log" containerID="cri-o://d9c8616fe8126b38220803ebb361899f50433e63e4da73fc8c7800beed22c9e0" gracePeriod=30 Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.462846 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"1b461fe4-edb9-423d-b17e-7cb251f7fc0d","Type":"ContainerStarted","Data":"c195933c0ae0d96122002799414129f837d005b33823bc491e94f174a16a025e"} Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.463566 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.463579 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"1b461fe4-edb9-423d-b17e-7cb251f7fc0d","Type":"ContainerStarted","Data":"d9c8616fe8126b38220803ebb361899f50433e63e4da73fc8c7800beed22c9e0"} Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.463644 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="1b461fe4-edb9-423d-b17e-7cb251f7fc0d" containerName="watcher-api" containerID="cri-o://c195933c0ae0d96122002799414129f837d005b33823bc491e94f174a16a025e" gracePeriod=30 Oct 03 18:32:10 crc kubenswrapper[4835]: W1003 18:32:10.484602 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf75f57a_7612_48c8_b3fb_8594e81e2d0a.slice/crio-8acfdc035a7402aa4159aedbcb018a19baacd856fa0e00bc4fe893c372ffeeaf WatchSource:0}: Error finding container 8acfdc035a7402aa4159aedbcb018a19baacd856fa0e00bc4fe893c372ffeeaf: Status 404 returned error can't find the container with id 8acfdc035a7402aa4159aedbcb018a19baacd856fa0e00bc4fe893c372ffeeaf Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.485777 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.485853 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc75da7c-52a7-421a-b04f-e9269a316a2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc 
kubenswrapper[4835]: I1003 18:32:10.485889 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.485914 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-logs\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.485938 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-config-data\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.485982 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.486007 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.486027 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc75da7c-52a7-421a-b04f-e9269a316a2e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.486081 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnzzz\" (UniqueName: \"kubernetes.io/projected/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-kube-api-access-vnzzz\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.486101 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-scripts\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.486131 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 
18:32:10.486153 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp525\" (UniqueName: \"kubernetes.io/projected/cc75da7c-52a7-421a-b04f-e9269a316a2e-kube-api-access-hp525\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.486182 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.486200 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.489295 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc75da7c-52a7-421a-b04f-e9269a316a2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.490089 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.490497 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc75da7c-52a7-421a-b04f-e9269a316a2e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.492918 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.500975 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dpk2w" event={"ID":"705966b1-0d0b-4c12-9cc1-830277fcf80c","Type":"ContainerStarted","Data":"9b60c38e579d20045d1c1ef8eb393212f20806911c820f169e411a415e656eb0"} Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.506652 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.506653 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57778c7597-llbt4" 
event={"ID":"5c7bc6f5-2d6b-475e-b294-9141ee21ceac","Type":"ContainerStarted","Data":"ffdff434e9b619b31363616e833164b096022fe0345b86de98922460713ae118"} Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.521282 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="1b461fe4-edb9-423d-b17e-7cb251f7fc0d" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.151:9322/\": dial tcp 10.217.0.151:9322: connect: connection refused" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.526858 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.527101 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp525\" (UniqueName: \"kubernetes.io/projected/cc75da7c-52a7-421a-b04f-e9269a316a2e-kube-api-access-hp525\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.538304 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xkgv4" event={"ID":"ddee52bd-4539-46b6-a51f-50fe9278665a","Type":"ContainerStarted","Data":"347d75c5230270d1f8503c10e9e0071b51c556f0b0ca6529f95bf6a95b9369a1"} Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.564342 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"4870398d-de86-4dc6-9052-b6e80bfe27f5","Type":"ContainerStarted","Data":"00607928be178c9b851aac8640dc7b476c26b3c8a837e8f0b9a0b621cdc4118f"} Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.591531 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-logs\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.591577 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-config-data\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.591643 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnzzz\" (UniqueName: \"kubernetes.io/projected/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-kube-api-access-vnzzz\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.591663 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-scripts\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.591708 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.591742 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.591778 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.591928 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.597321 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8110d0e5-9e19-4306-b8aa-babe937e8d2a","Type":"ContainerStarted","Data":"a65f3c32097efdcccd2d76df11f8f8af766a5fe3f2b6de5c70d7fbdf3bbbf7aa"} Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.597745 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-logs\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.597781 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.599360 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-config-data\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.618492 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-scripts\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.634115 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " 
pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.644787 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnzzz\" (UniqueName: \"kubernetes.io/projected/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-kube-api-access-vnzzz\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.677878 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.720268 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.720245718 podStartE2EDuration="3.720245718s" podCreationTimestamp="2025-10-03 18:32:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:10.505238192 +0000 UTC m=+1072.221179084" watchObservedRunningTime="2025-10-03 18:32:10.720245718 +0000 UTC m=+1072.436186590" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.740341 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.744299 4835 scope.go:117] "RemoveContainer" containerID="2f0830d1e26fbae5c67dd32c8c6bc93a219e35e8042df43e05f20af0d8134982" Oct 03 18:32:10 crc kubenswrapper[4835]: E1003 18:32:10.747699 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f0830d1e26fbae5c67dd32c8c6bc93a219e35e8042df43e05f20af0d8134982\": container with ID starting with 2f0830d1e26fbae5c67dd32c8c6bc93a219e35e8042df43e05f20af0d8134982 not found: ID does not exist" containerID="2f0830d1e26fbae5c67dd32c8c6bc93a219e35e8042df43e05f20af0d8134982" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.747738 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0830d1e26fbae5c67dd32c8c6bc93a219e35e8042df43e05f20af0d8134982"} err="failed to get container status \"2f0830d1e26fbae5c67dd32c8c6bc93a219e35e8042df43e05f20af0d8134982\": rpc error: code = NotFound desc = could not find container \"2f0830d1e26fbae5c67dd32c8c6bc93a219e35e8042df43e05f20af0d8134982\": container with ID starting with 2f0830d1e26fbae5c67dd32c8c6bc93a219e35e8042df43e05f20af0d8134982 not found: ID does not exist" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.789852 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4c45f94f-2g4gf"] Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.819275 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c4c45f94f-2g4gf"] Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.896052 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf560b83-49ed-48b6-b61f-d59166907390" path="/var/lib/kubelet/pods/cf560b83-49ed-48b6-b61f-d59166907390/volumes" Oct 03 18:32:10 crc 
kubenswrapper[4835]: I1003 18:32:10.957038 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 18:32:10 crc kubenswrapper[4835]: I1003 18:32:10.980059 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64bb4dc8df-rxs4r"] Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.043003 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.063278 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.201399 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-ovsdbserver-sb\") pod \"6689e7e5-6421-4939-b68b-d93c54479b72\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.201732 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb4xd\" (UniqueName: \"kubernetes.io/projected/6689e7e5-6421-4939-b68b-d93c54479b72-kube-api-access-rb4xd\") pod \"6689e7e5-6421-4939-b68b-d93c54479b72\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.201766 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-dns-svc\") pod \"6689e7e5-6421-4939-b68b-d93c54479b72\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.201994 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-config\") pod \"6689e7e5-6421-4939-b68b-d93c54479b72\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.202035 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-dns-swift-storage-0\") pod \"6689e7e5-6421-4939-b68b-d93c54479b72\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.202057 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-ovsdbserver-nb\") pod \"6689e7e5-6421-4939-b68b-d93c54479b72\" (UID: \"6689e7e5-6421-4939-b68b-d93c54479b72\") " Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.218446 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6689e7e5-6421-4939-b68b-d93c54479b72-kube-api-access-rb4xd" (OuterVolumeSpecName: "kube-api-access-rb4xd") pod "6689e7e5-6421-4939-b68b-d93c54479b72" (UID: "6689e7e5-6421-4939-b68b-d93c54479b72"). InnerVolumeSpecName "kube-api-access-rb4xd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.230543 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6689e7e5-6421-4939-b68b-d93c54479b72" (UID: "6689e7e5-6421-4939-b68b-d93c54479b72"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.237348 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-config" (OuterVolumeSpecName: "config") pod "6689e7e5-6421-4939-b68b-d93c54479b72" (UID: "6689e7e5-6421-4939-b68b-d93c54479b72"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.245509 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6689e7e5-6421-4939-b68b-d93c54479b72" (UID: "6689e7e5-6421-4939-b68b-d93c54479b72"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.251398 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6689e7e5-6421-4939-b68b-d93c54479b72" (UID: "6689e7e5-6421-4939-b68b-d93c54479b72"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.263553 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6689e7e5-6421-4939-b68b-d93c54479b72" (UID: "6689e7e5-6421-4939-b68b-d93c54479b72"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.306252 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.306287 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.306301 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.306311 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.306329 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb4xd\" (UniqueName: \"kubernetes.io/projected/6689e7e5-6421-4939-b68b-d93c54479b72-kube-api-access-rb4xd\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.306338 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6689e7e5-6421-4939-b68b-d93c54479b72-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.319428 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85456c75b5-fc7vc"] Oct 03 18:32:11 crc kubenswrapper[4835]: W1003 18:32:11.329990 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe7bd51c_f0e6_4059_9228_b64d3c39b6b8.slice/crio-1617842e8b4f29ffa937b8e15ce03375c4b6dc4f6b9942c62b8d4d4068d08c5b WatchSource:0}: Error finding container 1617842e8b4f29ffa937b8e15ce03375c4b6dc4f6b9942c62b8d4d4068d08c5b: Status 404 returned error can't find the container with id 1617842e8b4f29ffa937b8e15ce03375c4b6dc4f6b9942c62b8d4d4068d08c5b Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.617263 4835 generic.go:334] "Generic (PLEG): container finished" podID="6689e7e5-6421-4939-b68b-d93c54479b72" containerID="91efb08f470610fa331d51decc2970eb96f6832cd62d2cd671c0b48192e0deaa" exitCode=0 Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.617572 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" event={"ID":"6689e7e5-6421-4939-b68b-d93c54479b72","Type":"ContainerDied","Data":"73f4db1f0a25cf25f27f4f580f4c4110d17577109e19a90002c2511c90bc1b9a"} Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.617597 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" event={"ID":"6689e7e5-6421-4939-b68b-d93c54479b72","Type":"ContainerDied","Data":"91efb08f470610fa331d51decc2970eb96f6832cd62d2cd671c0b48192e0deaa"} Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.617612 4835 scope.go:117] "RemoveContainer" containerID="91efb08f470610fa331d51decc2970eb96f6832cd62d2cd671c0b48192e0deaa" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.617688 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c4cb4cdd5-hjjlp" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.666511 4835 scope.go:117] "RemoveContainer" containerID="91efb08f470610fa331d51decc2970eb96f6832cd62d2cd671c0b48192e0deaa" Oct 03 18:32:11 crc kubenswrapper[4835]: E1003 18:32:11.684200 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91efb08f470610fa331d51decc2970eb96f6832cd62d2cd671c0b48192e0deaa\": container with ID starting with 91efb08f470610fa331d51decc2970eb96f6832cd62d2cd671c0b48192e0deaa not found: ID does not exist" containerID="91efb08f470610fa331d51decc2970eb96f6832cd62d2cd671c0b48192e0deaa" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.684253 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91efb08f470610fa331d51decc2970eb96f6832cd62d2cd671c0b48192e0deaa"} err="failed to get container status \"91efb08f470610fa331d51decc2970eb96f6832cd62d2cd671c0b48192e0deaa\": rpc error: code = NotFound desc = could not find container \"91efb08f470610fa331d51decc2970eb96f6832cd62d2cd671c0b48192e0deaa\": container with ID starting with 91efb08f470610fa331d51decc2970eb96f6832cd62d2cd671c0b48192e0deaa not found: ID does not exist" Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.684691 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85456c75b5-fc7vc" event={"ID":"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8","Type":"ContainerStarted","Data":"1617842e8b4f29ffa937b8e15ce03375c4b6dc4f6b9942c62b8d4d4068d08c5b"} Oct 03 18:32:11 crc kubenswrapper[4835]: W1003 18:32:11.698038 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc75da7c_52a7_421a_b04f_e9269a316a2e.slice/crio-108504af6d453dc22b521f4538aba32ad686097d494a94e1e029f40928fc5761 WatchSource:0}: Error finding container 108504af6d453dc22b521f4538aba32ad686097d494a94e1e029f40928fc5761: Status 404 returned error can't find the container with id 108504af6d453dc22b521f4538aba32ad686097d494a94e1e029f40928fc5761 Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.698814 4835 generic.go:334] "Generic (PLEG): container finished" podID="1b461fe4-edb9-423d-b17e-7cb251f7fc0d" containerID="d9c8616fe8126b38220803ebb361899f50433e63e4da73fc8c7800beed22c9e0" exitCode=143 Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.698895 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"1b461fe4-edb9-423d-b17e-7cb251f7fc0d","Type":"ContainerDied","Data":"d9c8616fe8126b38220803ebb361899f50433e63e4da73fc8c7800beed22c9e0"} Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.714502 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af75f57a-7612-48c8-b3fb-8594e81e2d0a","Type":"ContainerStarted","Data":"8acfdc035a7402aa4159aedbcb018a19baacd856fa0e00bc4fe893c372ffeeaf"} Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.730352 4835 generic.go:334] "Generic (PLEG): container finished" podID="9ed367e4-c09b-46d6-82d0-f43eb6c4417d" containerID="9f2a2f8c71eed05b1a59bad13872c4d4edff6fed510baf8713ceb93e74c0fbb5" exitCode=0 Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.732747 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" 
event={"ID":"9ed367e4-c09b-46d6-82d0-f43eb6c4417d","Type":"ContainerDied","Data":"9f2a2f8c71eed05b1a59bad13872c4d4edff6fed510baf8713ceb93e74c0fbb5"} Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.732801 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" event={"ID":"9ed367e4-c09b-46d6-82d0-f43eb6c4417d","Type":"ContainerStarted","Data":"fcb0c806b985135c3ee20910e7b7d832f0edcf3d32755b4a74cd0dd7134c7936"} Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.751002 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.794717 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c4cb4cdd5-hjjlp"] Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.805353 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c4cb4cdd5-hjjlp"] Oct 03 18:32:11 crc kubenswrapper[4835]: I1003 18:32:11.872032 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 18:32:12 crc kubenswrapper[4835]: W1003 18:32:12.405676 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf13fd9ca_4ac9_4a8f_a3fa_e9efed841985.slice/crio-453dc01bb715ba71aea2848cf78fd3202d74d999c1f54302122bbe03dc136565 WatchSource:0}: Error finding container 453dc01bb715ba71aea2848cf78fd3202d74d999c1f54302122bbe03dc136565: Status 404 returned error can't find the container with id 453dc01bb715ba71aea2848cf78fd3202d74d999c1f54302122bbe03dc136565 Oct 03 18:32:12 crc kubenswrapper[4835]: I1003 18:32:12.523800 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c3bd-account-create-cvkm5" Oct 03 18:32:12 crc kubenswrapper[4835]: I1003 18:32:12.642566 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8qkm\" (UniqueName: \"kubernetes.io/projected/87bc8f1a-de34-4448-bc20-10e5b92907e6-kube-api-access-j8qkm\") pod \"87bc8f1a-de34-4448-bc20-10e5b92907e6\" (UID: \"87bc8f1a-de34-4448-bc20-10e5b92907e6\") " Oct 03 18:32:12 crc kubenswrapper[4835]: I1003 18:32:12.661453 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87bc8f1a-de34-4448-bc20-10e5b92907e6-kube-api-access-j8qkm" (OuterVolumeSpecName: "kube-api-access-j8qkm") pod "87bc8f1a-de34-4448-bc20-10e5b92907e6" (UID: "87bc8f1a-de34-4448-bc20-10e5b92907e6"). InnerVolumeSpecName "kube-api-access-j8qkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:12 crc kubenswrapper[4835]: I1003 18:32:12.744335 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8qkm\" (UniqueName: \"kubernetes.io/projected/87bc8f1a-de34-4448-bc20-10e5b92907e6-kube-api-access-j8qkm\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:12 crc kubenswrapper[4835]: I1003 18:32:12.765705 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cc75da7c-52a7-421a-b04f-e9269a316a2e","Type":"ContainerStarted","Data":"108504af6d453dc22b521f4538aba32ad686097d494a94e1e029f40928fc5761"} Oct 03 18:32:12 crc kubenswrapper[4835]: I1003 18:32:12.766960 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985","Type":"ContainerStarted","Data":"453dc01bb715ba71aea2848cf78fd3202d74d999c1f54302122bbe03dc136565"} Oct 03 18:32:12 crc kubenswrapper[4835]: I1003 18:32:12.768091 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c3bd-account-create-cvkm5" event={"ID":"87bc8f1a-de34-4448-bc20-10e5b92907e6","Type":"ContainerDied","Data":"554172fc7ed0f185a4e38fef3d02523f085004c60a5f09b9170b8cbe8025de2d"} Oct 03 18:32:12 crc kubenswrapper[4835]: I1003 18:32:12.768112 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="554172fc7ed0f185a4e38fef3d02523f085004c60a5f09b9170b8cbe8025de2d" Oct 03 18:32:12 crc kubenswrapper[4835]: I1003 18:32:12.768167 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c3bd-account-create-cvkm5" Oct 03 18:32:12 crc kubenswrapper[4835]: I1003 18:32:12.913740 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6689e7e5-6421-4939-b68b-d93c54479b72" path="/var/lib/kubelet/pods/6689e7e5-6421-4939-b68b-d93c54479b72/volumes" Oct 03 18:32:13 crc kubenswrapper[4835]: I1003 18:32:13.124388 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 03 18:32:15 crc kubenswrapper[4835]: I1003 18:32:15.301961 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 03 18:32:15 crc kubenswrapper[4835]: I1003 18:32:15.828128 4835 generic.go:334] "Generic (PLEG): container finished" podID="2cb51f11-d7f5-46ed-825f-6ca8c530094b" containerID="96b91c11670e6c312658d7ec0de4f3048c8396ee891f3797f4b740ebe3682a56" exitCode=0 Oct 03 18:32:15 crc kubenswrapper[4835]: I1003 18:32:15.828243 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f9k44" event={"ID":"2cb51f11-d7f5-46ed-825f-6ca8c530094b","Type":"ContainerDied","Data":"96b91c11670e6c312658d7ec0de4f3048c8396ee891f3797f4b740ebe3682a56"} Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.395647 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.540173 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-combined-ca-bundle\") pod \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.540230 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26qgm\" (UniqueName: \"kubernetes.io/projected/2cb51f11-d7f5-46ed-825f-6ca8c530094b-kube-api-access-26qgm\") pod \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.540332 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-fernet-keys\") pod \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.540374 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-scripts\") pod \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.540510 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-config-data\") pod \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.540536 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-credential-keys\") pod \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\" (UID: \"2cb51f11-d7f5-46ed-825f-6ca8c530094b\") " Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.555197 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-scripts" (OuterVolumeSpecName: "scripts") pod "2cb51f11-d7f5-46ed-825f-6ca8c530094b" (UID: "2cb51f11-d7f5-46ed-825f-6ca8c530094b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.558087 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2cb51f11-d7f5-46ed-825f-6ca8c530094b" (UID: "2cb51f11-d7f5-46ed-825f-6ca8c530094b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.566497 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2cb51f11-d7f5-46ed-825f-6ca8c530094b" (UID: "2cb51f11-d7f5-46ed-825f-6ca8c530094b"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.568330 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cb51f11-d7f5-46ed-825f-6ca8c530094b-kube-api-access-26qgm" (OuterVolumeSpecName: "kube-api-access-26qgm") pod "2cb51f11-d7f5-46ed-825f-6ca8c530094b" (UID: "2cb51f11-d7f5-46ed-825f-6ca8c530094b"). InnerVolumeSpecName "kube-api-access-26qgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.583740 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cb51f11-d7f5-46ed-825f-6ca8c530094b" (UID: "2cb51f11-d7f5-46ed-825f-6ca8c530094b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.595293 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-config-data" (OuterVolumeSpecName: "config-data") pod "2cb51f11-d7f5-46ed-825f-6ca8c530094b" (UID: "2cb51f11-d7f5-46ed-825f-6ca8c530094b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.606087 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-d274k"] Oct 03 18:32:17 crc kubenswrapper[4835]: E1003 18:32:17.606575 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6689e7e5-6421-4939-b68b-d93c54479b72" containerName="init" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.606594 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6689e7e5-6421-4939-b68b-d93c54479b72" containerName="init" Oct 03 18:32:17 crc kubenswrapper[4835]: E1003 18:32:17.606609 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87bc8f1a-de34-4448-bc20-10e5b92907e6" containerName="mariadb-account-create" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.606616 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="87bc8f1a-de34-4448-bc20-10e5b92907e6" containerName="mariadb-account-create" Oct 03 18:32:17 crc kubenswrapper[4835]: E1003 18:32:17.606630 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb51f11-d7f5-46ed-825f-6ca8c530094b" containerName="keystone-bootstrap" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.606636 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb51f11-d7f5-46ed-825f-6ca8c530094b" containerName="keystone-bootstrap" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.606827 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="87bc8f1a-de34-4448-bc20-10e5b92907e6" containerName="mariadb-account-create" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.606845 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb51f11-d7f5-46ed-825f-6ca8c530094b" containerName="keystone-bootstrap" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.606853 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="6689e7e5-6421-4939-b68b-d93c54479b72" containerName="init" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.607506 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-d274k" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.609917 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k4s6x" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.610089 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.610206 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.630940 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-d274k"] Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.642736 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.642766 4835 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.642775 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.642785 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26qgm\" (UniqueName: \"kubernetes.io/projected/2cb51f11-d7f5-46ed-825f-6ca8c530094b-kube-api-access-26qgm\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.642794 4835 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.642802 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cb51f11-d7f5-46ed-825f-6ca8c530094b-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.744551 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbh4l\" (UniqueName: \"kubernetes.io/projected/4f142f3b-9cce-451e-82b0-bfdac3ec661c-kube-api-access-dbh4l\") pod \"neutron-db-sync-d274k\" (UID: \"4f142f3b-9cce-451e-82b0-bfdac3ec661c\") " pod="openstack/neutron-db-sync-d274k" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.744595 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f142f3b-9cce-451e-82b0-bfdac3ec661c-combined-ca-bundle\") pod \"neutron-db-sync-d274k\" (UID: \"4f142f3b-9cce-451e-82b0-bfdac3ec661c\") " pod="openstack/neutron-db-sync-d274k" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.744771 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f142f3b-9cce-451e-82b0-bfdac3ec661c-config\") pod \"neutron-db-sync-d274k\" (UID: \"4f142f3b-9cce-451e-82b0-bfdac3ec661c\") " pod="openstack/neutron-db-sync-d274k" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 
18:32:17.845178 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f9k44" event={"ID":"2cb51f11-d7f5-46ed-825f-6ca8c530094b","Type":"ContainerDied","Data":"e9fda921116b2d41f9a2532bcbe6cc6fe8ed9b7659457e530a2cfe02590efd0e"} Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.845214 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9fda921116b2d41f9a2532bcbe6cc6fe8ed9b7659457e530a2cfe02590efd0e" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.845231 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f9k44" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.845758 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f142f3b-9cce-451e-82b0-bfdac3ec661c-config\") pod \"neutron-db-sync-d274k\" (UID: \"4f142f3b-9cce-451e-82b0-bfdac3ec661c\") " pod="openstack/neutron-db-sync-d274k" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.845824 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbh4l\" (UniqueName: \"kubernetes.io/projected/4f142f3b-9cce-451e-82b0-bfdac3ec661c-kube-api-access-dbh4l\") pod \"neutron-db-sync-d274k\" (UID: \"4f142f3b-9cce-451e-82b0-bfdac3ec661c\") " pod="openstack/neutron-db-sync-d274k" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.845853 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f142f3b-9cce-451e-82b0-bfdac3ec661c-combined-ca-bundle\") pod \"neutron-db-sync-d274k\" (UID: \"4f142f3b-9cce-451e-82b0-bfdac3ec661c\") " pod="openstack/neutron-db-sync-d274k" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.850217 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f142f3b-9cce-451e-82b0-bfdac3ec661c-config\") pod \"neutron-db-sync-d274k\" (UID: \"4f142f3b-9cce-451e-82b0-bfdac3ec661c\") " pod="openstack/neutron-db-sync-d274k" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.850651 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f142f3b-9cce-451e-82b0-bfdac3ec661c-combined-ca-bundle\") pod \"neutron-db-sync-d274k\" (UID: \"4f142f3b-9cce-451e-82b0-bfdac3ec661c\") " pod="openstack/neutron-db-sync-d274k" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.872680 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbh4l\" (UniqueName: \"kubernetes.io/projected/4f142f3b-9cce-451e-82b0-bfdac3ec661c-kube-api-access-dbh4l\") pod \"neutron-db-sync-d274k\" (UID: \"4f142f3b-9cce-451e-82b0-bfdac3ec661c\") " pod="openstack/neutron-db-sync-d274k" Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.942004 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-f9k44"] Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.950392 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-f9k44"] Oct 03 18:32:17 crc kubenswrapper[4835]: I1003 18:32:17.960312 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-d274k" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.028788 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-t2cnf"] Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.030939 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.034802 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vd8gf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.034908 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.035346 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.036111 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.044251 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t2cnf"] Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.151179 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7gpr\" (UniqueName: \"kubernetes.io/projected/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-kube-api-access-b7gpr\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.151232 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-fernet-keys\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.151276 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-combined-ca-bundle\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.151423 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-scripts\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.151746 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-config-data\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.151798 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-credential-keys\") pod \"keystone-bootstrap-t2cnf\" (UID: 
\"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.253955 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-fernet-keys\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.254038 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-combined-ca-bundle\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.254128 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-scripts\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.254240 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-config-data\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.254272 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-credential-keys\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.254302 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7gpr\" (UniqueName: \"kubernetes.io/projected/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-kube-api-access-b7gpr\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.258943 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-scripts\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.259725 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-fernet-keys\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.261654 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-credential-keys\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.262801 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-combined-ca-bundle\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.264425 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-config-data\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.291693 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7gpr\" (UniqueName: \"kubernetes.io/projected/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-kube-api-access-b7gpr\") pod \"keystone-bootstrap-t2cnf\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.362202 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:18 crc kubenswrapper[4835]: I1003 18:32:18.895032 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cb51f11-d7f5-46ed-825f-6ca8c530094b" path="/var/lib/kubelet/pods/2cb51f11-d7f5-46ed-825f-6ca8c530094b/volumes" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.097729 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.149737 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.372033 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59ddd97667-jf4l4"] Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.406803 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-64dcfd48b6-tpcpd"] Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.408320 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.420308 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.432647 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64dcfd48b6-tpcpd"] Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.472799 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-85456c75b5-fc7vc"] Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.482737 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-84859df966-b4t26"] Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.484190 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.504385 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-horizon-tls-certs\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.504467 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-horizon-secret-key\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.504503 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de5d465a-f009-4cef-940e-3b2aaa64468b-scripts\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.504522 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de5d465a-f009-4cef-940e-3b2aaa64468b-logs\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.504564 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-combined-ca-bundle\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.505453 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twmrd\" (UniqueName: \"kubernetes.io/projected/de5d465a-f009-4cef-940e-3b2aaa64468b-kube-api-access-twmrd\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.505511 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de5d465a-f009-4cef-940e-3b2aaa64468b-config-data\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.517088 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84859df966-b4t26"] Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.608605 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de5d465a-f009-4cef-940e-3b2aaa64468b-scripts\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.608641 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/de5d465a-f009-4cef-940e-3b2aaa64468b-logs\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.608675 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-combined-ca-bundle\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.608705 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-logs\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.608731 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-scripts\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.608747 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-combined-ca-bundle\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.608763 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twmrd\" (UniqueName: \"kubernetes.io/projected/de5d465a-f009-4cef-940e-3b2aaa64468b-kube-api-access-twmrd\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.608782 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de5d465a-f009-4cef-940e-3b2aaa64468b-config-data\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.608799 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-horizon-secret-key\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.608817 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzn9j\" (UniqueName: \"kubernetes.io/projected/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-kube-api-access-bzn9j\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.608871 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-horizon-tls-certs\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.609635 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de5d465a-f009-4cef-940e-3b2aaa64468b-logs\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.609643 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de5d465a-f009-4cef-940e-3b2aaa64468b-scripts\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.609851 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-horizon-tls-certs\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.609929 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-horizon-secret-key\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.609973 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-config-data\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.610363 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de5d465a-f009-4cef-940e-3b2aaa64468b-config-data\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.615697 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-horizon-tls-certs\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.619186 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-horizon-secret-key\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.627240 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twmrd\" (UniqueName: \"kubernetes.io/projected/de5d465a-f009-4cef-940e-3b2aaa64468b-kube-api-access-twmrd\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: 
\"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.628737 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-combined-ca-bundle\") pod \"horizon-64dcfd48b6-tpcpd\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.711674 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-combined-ca-bundle\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.711754 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-logs\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.711796 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-scripts\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.711829 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-horizon-secret-key\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.711860 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzn9j\" (UniqueName: \"kubernetes.io/projected/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-kube-api-access-bzn9j\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.711976 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-horizon-tls-certs\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.712026 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-config-data\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.712342 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-logs\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.713056 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-scripts\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.713642 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-config-data\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.714844 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-combined-ca-bundle\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.714944 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-horizon-tls-certs\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.718381 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-horizon-secret-key\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.732750 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzn9j\" (UniqueName: \"kubernetes.io/projected/dbf2013d-5dc5-4fe6-a408-08757b74ecc8-kube-api-access-bzn9j\") pod \"horizon-84859df966-b4t26\" (UID: \"dbf2013d-5dc5-4fe6-a408-08757b74ecc8\") " pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.740515 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:20 crc kubenswrapper[4835]: I1003 18:32:20.813595 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:23 crc kubenswrapper[4835]: E1003 18:32:23.821632 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-placement-api:watcher_latest" Oct 03 18:32:23 crc kubenswrapper[4835]: E1003 18:32:23.822238 4835 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-placement-api:watcher_latest" Oct 03 18:32:23 crc kubenswrapper[4835]: E1003 18:32:23.822383 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:38.102.83.82:5001/podified-master-centos10/openstack-placement-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bb5zf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-xkgv4_openstack(ddee52bd-4539-46b6-a51f-50fe9278665a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 18:32:23 crc kubenswrapper[4835]: E1003 18:32:23.823749 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-xkgv4" podUID="ddee52bd-4539-46b6-a51f-50fe9278665a" Oct 03 18:32:23 crc kubenswrapper[4835]: E1003 18:32:23.903449 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.82:5001/podified-master-centos10/openstack-placement-api:watcher_latest\\\"\"" pod="openstack/placement-db-sync-xkgv4" podUID="ddee52bd-4539-46b6-a51f-50fe9278665a" Oct 03 18:32:27 crc kubenswrapper[4835]: E1003 18:32:27.813484 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 03 18:32:27 crc kubenswrapper[4835]: E1003 18:32:27.814010 4835 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 03 18:32:27 crc kubenswrapper[4835]: E1003 18:32:27.814176 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.82:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5chc7h54bhb6hd4h59bh5b4h58bh596h58dh54dh5d8h664h68bh5c7h578h97hf9h66bh547h695h5ffh5d4h5h544hb5h8chd6h78hb5h9ch544q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8djxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-57778c7597-llbt4_openstack(5c7bc6f5-2d6b-475e-b294-9141ee21ceac): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 18:32:27 crc kubenswrapper[4835]: E1003 18:32:27.816090 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.82:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-57778c7597-llbt4" 
podUID="5c7bc6f5-2d6b-475e-b294-9141ee21ceac" Oct 03 18:32:27 crc kubenswrapper[4835]: E1003 18:32:27.818791 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 03 18:32:27 crc kubenswrapper[4835]: E1003 18:32:27.818824 4835 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 03 18:32:27 crc kubenswrapper[4835]: E1003 18:32:27.818938 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.82:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68bh66bh94h68h59bhcdh7fh57bh5c9hb8hcfhbbh69h5c8h666h584h5cbhf8h7bhd4h5ffhdfh649h557h9bh699h6h9h66dhc8h5b6h568q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xmrfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-85456c75b5-fc7vc_openstack(fe7bd51c-f0e6-4059-9228-b64d3c39b6b8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 18:32:27 crc kubenswrapper[4835]: E1003 18:32:27.820749 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.82:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-85456c75b5-fc7vc" podUID="fe7bd51c-f0e6-4059-9228-b64d3c39b6b8" Oct 03 18:32:27 crc kubenswrapper[4835]: E1003 18:32:27.827935 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="38.102.83.82:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 03 18:32:27 crc kubenswrapper[4835]: E1003 18:32:27.827990 4835 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 03 18:32:27 crc kubenswrapper[4835]: E1003 18:32:27.828186 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.82:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nf4hch55fhd6h68bh89h84h565h66dh59hc7h75h6bh5fh59fh568h5fbh5f5h66dh689h98h665h54bh589h89h56bh64fh545h79h5f7hc5h587q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wsjq6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-59ddd97667-jf4l4_openstack(fd72585e-bb84-41a5-bfde-d55a3978c294): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 18:32:27 crc kubenswrapper[4835]: E1003 18:32:27.830342 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.82:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-59ddd97667-jf4l4" podUID="fd72585e-bb84-41a5-bfde-d55a3978c294" Oct 03 18:32:35 crc kubenswrapper[4835]: I1003 18:32:35.358825 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:32:35 crc kubenswrapper[4835]: I1003 18:32:35.359267 4835 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.888551 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.897966 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.898592 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.949855 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-config-data\") pod \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.949962 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-scripts\") pod \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.950027 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd72585e-bb84-41a5-bfde-d55a3978c294-logs\") pod \"fd72585e-bb84-41a5-bfde-d55a3978c294\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.950058 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsjq6\" (UniqueName: \"kubernetes.io/projected/fd72585e-bb84-41a5-bfde-d55a3978c294-kube-api-access-wsjq6\") pod \"fd72585e-bb84-41a5-bfde-d55a3978c294\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.950127 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-scripts\") pod \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.950204 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd72585e-bb84-41a5-bfde-d55a3978c294-config-data\") pod \"fd72585e-bb84-41a5-bfde-d55a3978c294\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.950234 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd72585e-bb84-41a5-bfde-d55a3978c294-scripts\") pod \"fd72585e-bb84-41a5-bfde-d55a3978c294\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.950401 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8djxn\" (UniqueName: 
\"kubernetes.io/projected/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-kube-api-access-8djxn\") pod \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.950465 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-config-data\") pod \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.950514 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fd72585e-bb84-41a5-bfde-d55a3978c294-horizon-secret-key\") pod \"fd72585e-bb84-41a5-bfde-d55a3978c294\" (UID: \"fd72585e-bb84-41a5-bfde-d55a3978c294\") " Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.950528 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd72585e-bb84-41a5-bfde-d55a3978c294-logs" (OuterVolumeSpecName: "logs") pod "fd72585e-bb84-41a5-bfde-d55a3978c294" (UID: "fd72585e-bb84-41a5-bfde-d55a3978c294"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.950561 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-horizon-secret-key\") pod \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.950601 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-horizon-secret-key\") pod \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.950655 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-logs\") pod \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.950697 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-scripts" (OuterVolumeSpecName: "scripts") pod "fe7bd51c-f0e6-4059-9228-b64d3c39b6b8" (UID: "fe7bd51c-f0e6-4059-9228-b64d3c39b6b8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.950717 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-logs\") pod \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\" (UID: \"5c7bc6f5-2d6b-475e-b294-9141ee21ceac\") " Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.950787 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmrfl\" (UniqueName: \"kubernetes.io/projected/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-kube-api-access-xmrfl\") pod \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\" (UID: \"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8\") " Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.950785 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-scripts" (OuterVolumeSpecName: "scripts") pod "5c7bc6f5-2d6b-475e-b294-9141ee21ceac" (UID: "5c7bc6f5-2d6b-475e-b294-9141ee21ceac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.951065 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-logs" (OuterVolumeSpecName: "logs") pod "5c7bc6f5-2d6b-475e-b294-9141ee21ceac" (UID: "5c7bc6f5-2d6b-475e-b294-9141ee21ceac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.951270 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd72585e-bb84-41a5-bfde-d55a3978c294-scripts" (OuterVolumeSpecName: "scripts") pod "fd72585e-bb84-41a5-bfde-d55a3978c294" (UID: "fd72585e-bb84-41a5-bfde-d55a3978c294"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.951259 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-config-data" (OuterVolumeSpecName: "config-data") pod "fe7bd51c-f0e6-4059-9228-b64d3c39b6b8" (UID: "fe7bd51c-f0e6-4059-9228-b64d3c39b6b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.951368 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-logs" (OuterVolumeSpecName: "logs") pod "fe7bd51c-f0e6-4059-9228-b64d3c39b6b8" (UID: "fe7bd51c-f0e6-4059-9228-b64d3c39b6b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.951717 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-config-data" (OuterVolumeSpecName: "config-data") pod "5c7bc6f5-2d6b-475e-b294-9141ee21ceac" (UID: "5c7bc6f5-2d6b-475e-b294-9141ee21ceac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.954578 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd72585e-bb84-41a5-bfde-d55a3978c294-config-data" (OuterVolumeSpecName: "config-data") pod "fd72585e-bb84-41a5-bfde-d55a3978c294" (UID: "fd72585e-bb84-41a5-bfde-d55a3978c294"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.959914 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5c7bc6f5-2d6b-475e-b294-9141ee21ceac" (UID: "5c7bc6f5-2d6b-475e-b294-9141ee21ceac"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.960592 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd72585e-bb84-41a5-bfde-d55a3978c294-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fd72585e-bb84-41a5-bfde-d55a3978c294" (UID: "fd72585e-bb84-41a5-bfde-d55a3978c294"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.960652 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd72585e-bb84-41a5-bfde-d55a3978c294-kube-api-access-wsjq6" (OuterVolumeSpecName: "kube-api-access-wsjq6") pod "fd72585e-bb84-41a5-bfde-d55a3978c294" (UID: "fd72585e-bb84-41a5-bfde-d55a3978c294"). InnerVolumeSpecName "kube-api-access-wsjq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.961051 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd72585e-bb84-41a5-bfde-d55a3978c294-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.961092 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd72585e-bb84-41a5-bfde-d55a3978c294-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.961104 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.961116 4835 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fd72585e-bb84-41a5-bfde-d55a3978c294-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.961128 4835 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.961141 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.961150 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.961161 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.961171 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.961971 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-kube-api-access-xmrfl" (OuterVolumeSpecName: "kube-api-access-xmrfl") pod "fe7bd51c-f0e6-4059-9228-b64d3c39b6b8" (UID: "fe7bd51c-f0e6-4059-9228-b64d3c39b6b8"). InnerVolumeSpecName "kube-api-access-xmrfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.962009 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd72585e-bb84-41a5-bfde-d55a3978c294-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.962022 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsjq6\" (UniqueName: \"kubernetes.io/projected/fd72585e-bb84-41a5-bfde-d55a3978c294-kube-api-access-wsjq6\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.962037 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.962327 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fe7bd51c-f0e6-4059-9228-b64d3c39b6b8" (UID: "fe7bd51c-f0e6-4059-9228-b64d3c39b6b8"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:37 crc kubenswrapper[4835]: I1003 18:32:37.965168 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-kube-api-access-8djxn" (OuterVolumeSpecName: "kube-api-access-8djxn") pod "5c7bc6f5-2d6b-475e-b294-9141ee21ceac" (UID: "5c7bc6f5-2d6b-475e-b294-9141ee21ceac"). InnerVolumeSpecName "kube-api-access-8djxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.014149 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85456c75b5-fc7vc" event={"ID":"fe7bd51c-f0e6-4059-9228-b64d3c39b6b8","Type":"ContainerDied","Data":"1617842e8b4f29ffa937b8e15ce03375c4b6dc4f6b9942c62b8d4d4068d08c5b"} Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.014162 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85456c75b5-fc7vc" Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.026295 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59ddd97667-jf4l4" Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.026331 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59ddd97667-jf4l4" event={"ID":"fd72585e-bb84-41a5-bfde-d55a3978c294","Type":"ContainerDied","Data":"5da3b46e51366cf2d2d0bfc675fe83f681f598c4378c0b47c12778c7ffdb5665"} Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.027722 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57778c7597-llbt4" event={"ID":"5c7bc6f5-2d6b-475e-b294-9141ee21ceac","Type":"ContainerDied","Data":"ffdff434e9b619b31363616e833164b096022fe0345b86de98922460713ae118"} Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.027790 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57778c7597-llbt4" Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.063876 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8djxn\" (UniqueName: \"kubernetes.io/projected/5c7bc6f5-2d6b-475e-b294-9141ee21ceac-kube-api-access-8djxn\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.063904 4835 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.063914 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmrfl\" (UniqueName: \"kubernetes.io/projected/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8-kube-api-access-xmrfl\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.082210 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-85456c75b5-fc7vc"] Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.099728 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-85456c75b5-fc7vc"] Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.123874 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59ddd97667-jf4l4"] Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.131693 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-59ddd97667-jf4l4"] Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.147159 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57778c7597-llbt4"] Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.154079 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-57778c7597-llbt4"] Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.897886 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c7bc6f5-2d6b-475e-b294-9141ee21ceac" path="/var/lib/kubelet/pods/5c7bc6f5-2d6b-475e-b294-9141ee21ceac/volumes" Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.898652 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd72585e-bb84-41a5-bfde-d55a3978c294" path="/var/lib/kubelet/pods/fd72585e-bb84-41a5-bfde-d55a3978c294/volumes" Oct 03 18:32:38 crc kubenswrapper[4835]: I1003 18:32:38.899099 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe7bd51c-f0e6-4059-9228-b64d3c39b6b8" path="/var/lib/kubelet/pods/fe7bd51c-f0e6-4059-9228-b64d3c39b6b8/volumes" Oct 03 18:32:39 crc kubenswrapper[4835]: E1003 18:32:39.300775 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Oct 03 18:32:39 crc kubenswrapper[4835]: E1003 18:32:39.300819 4835 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Oct 03 18:32:39 crc kubenswrapper[4835]: E1003 18:32:39.300941 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.82:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-22cdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-dpk2w_openstack(705966b1-0d0b-4c12-9cc1-830277fcf80c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 18:32:39 crc kubenswrapper[4835]: E1003 18:32:39.302155 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-dpk2w" podUID="705966b1-0d0b-4c12-9cc1-830277fcf80c" Oct 03 18:32:39 crc kubenswrapper[4835]: I1003 18:32:39.802673 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84859df966-b4t26"] Oct 03 18:32:39 crc kubenswrapper[4835]: I1003 18:32:39.906318 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-d274k"] Oct 03 18:32:39 crc kubenswrapper[4835]: W1003 18:32:39.919108 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde5d465a_f009_4cef_940e_3b2aaa64468b.slice/crio-56c851691059668dc7e722bad63f58ebc407d51feb1ac1b597a2a0e0b6897174 WatchSource:0}: Error finding container 56c851691059668dc7e722bad63f58ebc407d51feb1ac1b597a2a0e0b6897174: Status 404 returned error can't find the container with id 
56c851691059668dc7e722bad63f58ebc407d51feb1ac1b597a2a0e0b6897174 Oct 03 18:32:39 crc kubenswrapper[4835]: I1003 18:32:39.919480 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64dcfd48b6-tpcpd"] Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.025271 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t2cnf"] Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.048763 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cc75da7c-52a7-421a-b04f-e9269a316a2e","Type":"ContainerStarted","Data":"2272598796a815bd34b2dd4528987f4a5cd03ca1d8547260750c423f44ddd1f8"} Oct 03 18:32:40 crc kubenswrapper[4835]: W1003 18:32:40.052755 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a4da7b3_24ff_4f0a_8fd5_95e81f4e5d5a.slice/crio-b3e9ec9d5d783e5d54ea9bea486a9371c339669ee0c4478ce584a30863540d2f WatchSource:0}: Error finding container b3e9ec9d5d783e5d54ea9bea486a9371c339669ee0c4478ce584a30863540d2f: Status 404 returned error can't find the container with id b3e9ec9d5d783e5d54ea9bea486a9371c339669ee0c4478ce584a30863540d2f Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.058463 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"4870398d-de86-4dc6-9052-b6e80bfe27f5","Type":"ContainerStarted","Data":"0d08847f90439a606c24746338e1babf388d0f4e01e141e2fe52bc5dffda55b9"} Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.063575 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fz6rs" event={"ID":"709d622b-7993-4d18-8185-10b4f1c81d79","Type":"ContainerStarted","Data":"e80a414b0a4a919d1e0d59a1d6d41f19a37acaf0b810e3050b6700b1ac80c2cd"} Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.070961 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8110d0e5-9e19-4306-b8aa-babe937e8d2a","Type":"ContainerStarted","Data":"de105950d6ac7e7156a461005afa919948679bf395b58c050488f8cb4863a6cf"} Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.072626 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d274k" event={"ID":"4f142f3b-9cce-451e-82b0-bfdac3ec661c","Type":"ContainerStarted","Data":"66acab9f51780b5c23866d9c03343cc0e58cc37bb2b57fa34e46e79006182dcb"} Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.084961 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=6.511547474 podStartE2EDuration="33.084932648s" podCreationTimestamp="2025-10-03 18:32:07 +0000 UTC" firstStartedPulling="2025-10-03 18:32:09.646766918 +0000 UTC m=+1071.362707790" lastFinishedPulling="2025-10-03 18:32:36.220152092 +0000 UTC m=+1097.936092964" observedRunningTime="2025-10-03 18:32:40.077005623 +0000 UTC m=+1101.792946495" watchObservedRunningTime="2025-10-03 18:32:40.084932648 +0000 UTC m=+1101.800873520" Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.086236 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af75f57a-7612-48c8-b3fb-8594e81e2d0a","Type":"ContainerStarted","Data":"6af7b4aca2ab42921f496c3018bb159e5b3950431d0b3ea9aebf155918194167"} Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.087800 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xkgv4" 
event={"ID":"ddee52bd-4539-46b6-a51f-50fe9278665a","Type":"ContainerStarted","Data":"5cd877aa81327a4fb6f95fd381cf6d8fa29e66b4e8e1d3ca29fd19134069036c"} Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.098714 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=5.318008484 podStartE2EDuration="33.098695656s" podCreationTimestamp="2025-10-03 18:32:07 +0000 UTC" firstStartedPulling="2025-10-03 18:32:09.965894853 +0000 UTC m=+1071.681835725" lastFinishedPulling="2025-10-03 18:32:37.746582025 +0000 UTC m=+1099.462522897" observedRunningTime="2025-10-03 18:32:40.097499327 +0000 UTC m=+1101.813440229" watchObservedRunningTime="2025-10-03 18:32:40.098695656 +0000 UTC m=+1101.814636538" Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.104685 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" event={"ID":"9ed367e4-c09b-46d6-82d0-f43eb6c4417d","Type":"ContainerStarted","Data":"a4ce161e93558d68738f93dcb9b50888432950540aa7d4afb1f088b99380846f"} Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.105438 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.107608 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84859df966-b4t26" event={"ID":"dbf2013d-5dc5-4fe6-a408-08757b74ecc8","Type":"ContainerStarted","Data":"22822e5cc4e818c50ec9802d78bb1f29c86cf15cf1ab50592756d891ebf96299"} Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.108877 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64dcfd48b6-tpcpd" event={"ID":"de5d465a-f009-4cef-940e-3b2aaa64468b","Type":"ContainerStarted","Data":"56c851691059668dc7e722bad63f58ebc407d51feb1ac1b597a2a0e0b6897174"} Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.116332 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-fz6rs" podStartSLOduration=5.408168741 podStartE2EDuration="33.1163166s" podCreationTimestamp="2025-10-03 18:32:07 +0000 UTC" firstStartedPulling="2025-10-03 18:32:10.077189139 +0000 UTC m=+1071.793130011" lastFinishedPulling="2025-10-03 18:32:37.785336998 +0000 UTC m=+1099.501277870" observedRunningTime="2025-10-03 18:32:40.111177433 +0000 UTC m=+1101.827118315" watchObservedRunningTime="2025-10-03 18:32:40.1163166 +0000 UTC m=+1101.832257472" Oct 03 18:32:40 crc kubenswrapper[4835]: E1003 18:32:40.137348 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.82:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-dpk2w" podUID="705966b1-0d0b-4c12-9cc1-830277fcf80c" Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.153200 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" podStartSLOduration=31.153151885 podStartE2EDuration="31.153151885s" podCreationTimestamp="2025-10-03 18:32:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:40.150424188 +0000 UTC m=+1101.866365060" watchObservedRunningTime="2025-10-03 18:32:40.153151885 +0000 UTC m=+1101.869092757" Oct 03 18:32:40 crc kubenswrapper[4835]: I1003 18:32:40.173669 4835 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-xkgv4" podStartSLOduration=3.5090039859999997 podStartE2EDuration="33.173649009s" podCreationTimestamp="2025-10-03 18:32:07 +0000 UTC" firstStartedPulling="2025-10-03 18:32:09.686759682 +0000 UTC m=+1071.402700554" lastFinishedPulling="2025-10-03 18:32:39.351404685 +0000 UTC m=+1101.067345577" observedRunningTime="2025-10-03 18:32:40.163002857 +0000 UTC m=+1101.878943729" watchObservedRunningTime="2025-10-03 18:32:40.173649009 +0000 UTC m=+1101.889589881" Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.125766 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cc75da7c-52a7-421a-b04f-e9269a316a2e","Type":"ContainerStarted","Data":"799212aecbac01f6c4f4cbdf3b3acb8691191075612225de77859a4661effa9c"} Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.126021 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cc75da7c-52a7-421a-b04f-e9269a316a2e" containerName="glance-log" containerID="cri-o://2272598796a815bd34b2dd4528987f4a5cd03ca1d8547260750c423f44ddd1f8" gracePeriod=30 Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.126388 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cc75da7c-52a7-421a-b04f-e9269a316a2e" containerName="glance-httpd" containerID="cri-o://799212aecbac01f6c4f4cbdf3b3acb8691191075612225de77859a4661effa9c" gracePeriod=30 Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.140348 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985","Type":"ContainerStarted","Data":"5b1e6e211be4549643e8d4bf942891e5069b0595fa1a166069d244e3a094592b"} Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.140395 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985","Type":"ContainerStarted","Data":"87c1396fe1c1128c2855a281cb7362b9c9732a8d22d937fa507167d051e731cf"} Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.140542 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" containerName="glance-log" containerID="cri-o://87c1396fe1c1128c2855a281cb7362b9c9732a8d22d937fa507167d051e731cf" gracePeriod=30 Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.140634 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" containerName="glance-httpd" containerID="cri-o://5b1e6e211be4549643e8d4bf942891e5069b0595fa1a166069d244e3a094592b" gracePeriod=30 Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.161533 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84859df966-b4t26" event={"ID":"dbf2013d-5dc5-4fe6-a408-08757b74ecc8","Type":"ContainerStarted","Data":"e5e130f9f60a7ea925e416ddcdc1f8d9dac2e1ab046a76a72203f396d842d878"} Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.161613 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84859df966-b4t26" 
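The pod_startup_latency_tracker entries here make the accounting visible: podStartE2EDuration is the observed running time minus the pod creation timestamp, and the podStartSLOduration values are consistent with subtracting the image-pull window (firstStartedPulling to lastFinishedPulling) from that interval. A worked check against the watcher-applier-0 entry above, with the logged timestamps truncated to microseconds:

# Worked example: reproduce podStartE2EDuration and podStartSLOduration for watcher-applier-0.
from datetime import datetime

fmt = "%Y-%m-%d %H:%M:%S.%f"
created    = datetime.strptime("2025-10-03 18:32:07.000000", fmt)  # podCreationTimestamp
pull_start = datetime.strptime("2025-10-03 18:32:09.646766", fmt)  # firstStartedPulling
pull_end   = datetime.strptime("2025-10-03 18:32:36.220152", fmt)  # lastFinishedPulling
running    = datetime.strptime("2025-10-03 18:32:40.084932", fmt)  # watchObservedRunningTime

e2e = (running - created).total_seconds()
slo = e2e - (pull_end - pull_start).total_seconds()
print(f"E2E ~= {e2e:.3f}s, SLO ~= {slo:.3f}s")  # ~33.085s and ~6.512s

which reproduces the logged 33.084932648s and 6.511547474s to within the truncated precision.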
event={"ID":"dbf2013d-5dc5-4fe6-a408-08757b74ecc8","Type":"ContainerStarted","Data":"d93c98bfae348a1d6fc0eb31ee4e25d9ed10fa39f233e56bac3150780f610179"} Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.166135 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t2cnf" event={"ID":"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a","Type":"ContainerStarted","Data":"612fafc7ff04680a4fe6b0807c59155113a4aefe07daa8e9ca167e389427b7c1"} Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.166176 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t2cnf" event={"ID":"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a","Type":"ContainerStarted","Data":"b3e9ec9d5d783e5d54ea9bea486a9371c339669ee0c4478ce584a30863540d2f"} Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.197624 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=31.19760465 podStartE2EDuration="31.19760465s" podCreationTimestamp="2025-10-03 18:32:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:41.166263558 +0000 UTC m=+1102.882204430" watchObservedRunningTime="2025-10-03 18:32:41.19760465 +0000 UTC m=+1102.913545522" Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.199206 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=31.199199549 podStartE2EDuration="31.199199549s" podCreationTimestamp="2025-10-03 18:32:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:41.145652377 +0000 UTC m=+1102.861593249" watchObservedRunningTime="2025-10-03 18:32:41.199199549 +0000 UTC m=+1102.915140421" Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.202772 4835 generic.go:334] "Generic (PLEG): container finished" podID="1b461fe4-edb9-423d-b17e-7cb251f7fc0d" containerID="c195933c0ae0d96122002799414129f837d005b33823bc491e94f174a16a025e" exitCode=137 Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.202833 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"1b461fe4-edb9-423d-b17e-7cb251f7fc0d","Type":"ContainerDied","Data":"c195933c0ae0d96122002799414129f837d005b33823bc491e94f174a16a025e"} Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.213416 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d274k" event={"ID":"4f142f3b-9cce-451e-82b0-bfdac3ec661c","Type":"ContainerStarted","Data":"77b55f5478320213dad5b94639c84645d6fcdc901ad041694eeb730d015016f5"} Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.219184 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64dcfd48b6-tpcpd" event={"ID":"de5d465a-f009-4cef-940e-3b2aaa64468b","Type":"ContainerStarted","Data":"83e641e13eea890082718c9960c998038c38e8137ea51f5b598f75b184ca52c4"} Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.219220 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64dcfd48b6-tpcpd" event={"ID":"de5d465a-f009-4cef-940e-3b2aaa64468b","Type":"ContainerStarted","Data":"c98cca13a795e8c49418d2713ce1f52d7c5add2a8a45469c7d0d5ca2a4207bec"} Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.257318 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-bootstrap-t2cnf" podStartSLOduration=23.257293493 podStartE2EDuration="23.257293493s" podCreationTimestamp="2025-10-03 18:32:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:41.208156748 +0000 UTC m=+1102.924097620" watchObservedRunningTime="2025-10-03 18:32:41.257293493 +0000 UTC m=+1102.973234365" Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.279557 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-84859df966-b4t26" podStartSLOduration=21.211678026 podStartE2EDuration="21.279537294s" podCreationTimestamp="2025-10-03 18:32:20 +0000 UTC" firstStartedPulling="2025-10-03 18:32:39.820236401 +0000 UTC m=+1101.536177273" lastFinishedPulling="2025-10-03 18:32:39.888095669 +0000 UTC m=+1101.604036541" observedRunningTime="2025-10-03 18:32:41.229145088 +0000 UTC m=+1102.945085980" watchObservedRunningTime="2025-10-03 18:32:41.279537294 +0000 UTC m=+1102.995478166" Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.287699 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-d274k" podStartSLOduration=24.287657291 podStartE2EDuration="24.287657291s" podCreationTimestamp="2025-10-03 18:32:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:41.247217817 +0000 UTC m=+1102.963158689" watchObservedRunningTime="2025-10-03 18:32:41.287657291 +0000 UTC m=+1103.003598163" Oct 03 18:32:41 crc kubenswrapper[4835]: I1003 18:32:41.299250 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-64dcfd48b6-tpcpd" podStartSLOduration=21.299232033 podStartE2EDuration="21.299232033s" podCreationTimestamp="2025-10-03 18:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:41.26869425 +0000 UTC m=+1102.984635122" watchObservedRunningTime="2025-10-03 18:32:41.299232033 +0000 UTC m=+1103.015172905" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.244227 4835 generic.go:334] "Generic (PLEG): container finished" podID="cc75da7c-52a7-421a-b04f-e9269a316a2e" containerID="799212aecbac01f6c4f4cbdf3b3acb8691191075612225de77859a4661effa9c" exitCode=0 Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.244521 4835 generic.go:334] "Generic (PLEG): container finished" podID="cc75da7c-52a7-421a-b04f-e9269a316a2e" containerID="2272598796a815bd34b2dd4528987f4a5cd03ca1d8547260750c423f44ddd1f8" exitCode=143 Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.244305 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cc75da7c-52a7-421a-b04f-e9269a316a2e","Type":"ContainerDied","Data":"799212aecbac01f6c4f4cbdf3b3acb8691191075612225de77859a4661effa9c"} Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.244605 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cc75da7c-52a7-421a-b04f-e9269a316a2e","Type":"ContainerDied","Data":"2272598796a815bd34b2dd4528987f4a5cd03ca1d8547260750c423f44ddd1f8"} Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.246871 4835 generic.go:334] "Generic (PLEG): container finished" podID="f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" 
containerID="5b1e6e211be4549643e8d4bf942891e5069b0595fa1a166069d244e3a094592b" exitCode=143 Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.246907 4835 generic.go:334] "Generic (PLEG): container finished" podID="f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" containerID="87c1396fe1c1128c2855a281cb7362b9c9732a8d22d937fa507167d051e731cf" exitCode=143 Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.247002 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985","Type":"ContainerDied","Data":"5b1e6e211be4549643e8d4bf942891e5069b0595fa1a166069d244e3a094592b"} Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.247044 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985","Type":"ContainerDied","Data":"87c1396fe1c1128c2855a281cb7362b9c9732a8d22d937fa507167d051e731cf"} Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.402461 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.462565 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5z7m\" (UniqueName: \"kubernetes.io/projected/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-kube-api-access-b5z7m\") pod \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.462682 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-combined-ca-bundle\") pod \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.462734 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-custom-prometheus-ca\") pod \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.462757 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-logs\") pod \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.462794 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-config-data\") pod \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\" (UID: \"1b461fe4-edb9-423d-b17e-7cb251f7fc0d\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.463829 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-logs" (OuterVolumeSpecName: "logs") pod "1b461fe4-edb9-423d-b17e-7cb251f7fc0d" (UID: "1b461fe4-edb9-423d-b17e-7cb251f7fc0d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.468396 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-kube-api-access-b5z7m" (OuterVolumeSpecName: "kube-api-access-b5z7m") pod "1b461fe4-edb9-423d-b17e-7cb251f7fc0d" (UID: "1b461fe4-edb9-423d-b17e-7cb251f7fc0d"). InnerVolumeSpecName "kube-api-access-b5z7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.503452 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b461fe4-edb9-423d-b17e-7cb251f7fc0d" (UID: "1b461fe4-edb9-423d-b17e-7cb251f7fc0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.564433 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.564463 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5z7m\" (UniqueName: \"kubernetes.io/projected/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-kube-api-access-b5z7m\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.564476 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.567851 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-config-data" (OuterVolumeSpecName: "config-data") pod "1b461fe4-edb9-423d-b17e-7cb251f7fc0d" (UID: "1b461fe4-edb9-423d-b17e-7cb251f7fc0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.608739 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "1b461fe4-edb9-423d-b17e-7cb251f7fc0d" (UID: "1b461fe4-edb9-423d-b17e-7cb251f7fc0d"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.668495 4835 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.669128 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b461fe4-edb9-423d-b17e-7cb251f7fc0d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.670462 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.678690 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.771163 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-combined-ca-bundle\") pod \"cc75da7c-52a7-421a-b04f-e9269a316a2e\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.771218 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc75da7c-52a7-421a-b04f-e9269a316a2e-logs\") pod \"cc75da7c-52a7-421a-b04f-e9269a316a2e\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.771245 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-scripts\") pod \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.771267 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnzzz\" (UniqueName: \"kubernetes.io/projected/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-kube-api-access-vnzzz\") pod \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.771324 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-config-data\") pod \"cc75da7c-52a7-421a-b04f-e9269a316a2e\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.771346 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-scripts\") pod \"cc75da7c-52a7-421a-b04f-e9269a316a2e\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.771366 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-combined-ca-bundle\") pod \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.771396 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-httpd-run\") pod \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.771412 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-logs\") pod \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.771428 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cc75da7c-52a7-421a-b04f-e9269a316a2e-httpd-run\") pod \"cc75da7c-52a7-421a-b04f-e9269a316a2e\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " Oct 03 18:32:42 crc 
kubenswrapper[4835]: I1003 18:32:42.771453 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-config-data\") pod \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.771482 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\" (UID: \"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.771495 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cc75da7c-52a7-421a-b04f-e9269a316a2e\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.771534 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp525\" (UniqueName: \"kubernetes.io/projected/cc75da7c-52a7-421a-b04f-e9269a316a2e-kube-api-access-hp525\") pod \"cc75da7c-52a7-421a-b04f-e9269a316a2e\" (UID: \"cc75da7c-52a7-421a-b04f-e9269a316a2e\") " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.776474 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-scripts" (OuterVolumeSpecName: "scripts") pod "f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" (UID: "f13fd9ca-4ac9-4a8f-a3fa-e9efed841985"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.778496 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc75da7c-52a7-421a-b04f-e9269a316a2e-kube-api-access-hp525" (OuterVolumeSpecName: "kube-api-access-hp525") pod "cc75da7c-52a7-421a-b04f-e9269a316a2e" (UID: "cc75da7c-52a7-421a-b04f-e9269a316a2e"). InnerVolumeSpecName "kube-api-access-hp525". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.778736 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" (UID: "f13fd9ca-4ac9-4a8f-a3fa-e9efed841985"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.778897 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-logs" (OuterVolumeSpecName: "logs") pod "f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" (UID: "f13fd9ca-4ac9-4a8f-a3fa-e9efed841985"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.779060 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc75da7c-52a7-421a-b04f-e9269a316a2e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cc75da7c-52a7-421a-b04f-e9269a316a2e" (UID: "cc75da7c-52a7-421a-b04f-e9269a316a2e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.780273 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-kube-api-access-vnzzz" (OuterVolumeSpecName: "kube-api-access-vnzzz") pod "f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" (UID: "f13fd9ca-4ac9-4a8f-a3fa-e9efed841985"). InnerVolumeSpecName "kube-api-access-vnzzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.782166 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-scripts" (OuterVolumeSpecName: "scripts") pod "cc75da7c-52a7-421a-b04f-e9269a316a2e" (UID: "cc75da7c-52a7-421a-b04f-e9269a316a2e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.782851 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc75da7c-52a7-421a-b04f-e9269a316a2e-logs" (OuterVolumeSpecName: "logs") pod "cc75da7c-52a7-421a-b04f-e9269a316a2e" (UID: "cc75da7c-52a7-421a-b04f-e9269a316a2e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.783794 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" (UID: "f13fd9ca-4ac9-4a8f-a3fa-e9efed841985"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.784605 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "cc75da7c-52a7-421a-b04f-e9269a316a2e" (UID: "cc75da7c-52a7-421a-b04f-e9269a316a2e"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.802554 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc75da7c-52a7-421a-b04f-e9269a316a2e" (UID: "cc75da7c-52a7-421a-b04f-e9269a316a2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.809166 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" (UID: "f13fd9ca-4ac9-4a8f-a3fa-e9efed841985"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.825187 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-config-data" (OuterVolumeSpecName: "config-data") pod "f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" (UID: "f13fd9ca-4ac9-4a8f-a3fa-e9efed841985"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.829625 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-config-data" (OuterVolumeSpecName: "config-data") pod "cc75da7c-52a7-421a-b04f-e9269a316a2e" (UID: "cc75da7c-52a7-421a-b04f-e9269a316a2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.878249 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.878289 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp525\" (UniqueName: \"kubernetes.io/projected/cc75da7c-52a7-421a-b04f-e9269a316a2e-kube-api-access-hp525\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.878306 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.878319 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc75da7c-52a7-421a-b04f-e9269a316a2e-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.878331 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.878342 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnzzz\" (UniqueName: \"kubernetes.io/projected/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-kube-api-access-vnzzz\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.878354 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.878365 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc75da7c-52a7-421a-b04f-e9269a316a2e-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.878376 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.878387 4835 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.878399 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.878411 4835 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/cc75da7c-52a7-421a-b04f-e9269a316a2e-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.878422 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.878439 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.910453 4835 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.914861 4835 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.980262 4835 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:42 crc kubenswrapper[4835]: I1003 18:32:42.980286 4835 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.155178 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.257309 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cc75da7c-52a7-421a-b04f-e9269a316a2e","Type":"ContainerDied","Data":"108504af6d453dc22b521f4538aba32ad686097d494a94e1e029f40928fc5761"} Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.257386 4835 scope.go:117] "RemoveContainer" containerID="799212aecbac01f6c4f4cbdf3b3acb8691191075612225de77859a4661effa9c" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.257515 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.261667 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f13fd9ca-4ac9-4a8f-a3fa-e9efed841985","Type":"ContainerDied","Data":"453dc01bb715ba71aea2848cf78fd3202d74d999c1f54302122bbe03dc136565"} Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.261758 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.266336 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af75f57a-7612-48c8-b3fb-8594e81e2d0a","Type":"ContainerStarted","Data":"64b3d992f1132cf7ecff87b380648189301d52c52ecdb56ec440407b04213ed6"} Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.268718 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"1b461fe4-edb9-423d-b17e-7cb251f7fc0d","Type":"ContainerDied","Data":"30a15d82c21e1ce46eb6850426cfb175e63efa5c262a63942dcd598e85437e24"} Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.268793 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.290567 4835 scope.go:117] "RemoveContainer" containerID="2272598796a815bd34b2dd4528987f4a5cd03ca1d8547260750c423f44ddd1f8" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.294709 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.317630 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.335175 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.346212 4835 scope.go:117] "RemoveContainer" containerID="5b1e6e211be4549643e8d4bf942891e5069b0595fa1a166069d244e3a094592b" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.360305 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.372807 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 18:32:43 crc kubenswrapper[4835]: E1003 18:32:43.373281 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b461fe4-edb9-423d-b17e-7cb251f7fc0d" containerName="watcher-api" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.373302 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b461fe4-edb9-423d-b17e-7cb251f7fc0d" containerName="watcher-api" Oct 03 18:32:43 crc kubenswrapper[4835]: E1003 18:32:43.373311 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" containerName="glance-httpd" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.373318 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" containerName="glance-httpd" Oct 03 18:32:43 crc kubenswrapper[4835]: E1003 18:32:43.373335 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" containerName="glance-log" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.373349 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" containerName="glance-log" Oct 03 18:32:43 crc kubenswrapper[4835]: E1003 18:32:43.373359 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc75da7c-52a7-421a-b04f-e9269a316a2e" containerName="glance-log" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.373364 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc75da7c-52a7-421a-b04f-e9269a316a2e" containerName="glance-log" Oct 03 
18:32:43 crc kubenswrapper[4835]: E1003 18:32:43.373374 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b461fe4-edb9-423d-b17e-7cb251f7fc0d" containerName="watcher-api-log" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.373380 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b461fe4-edb9-423d-b17e-7cb251f7fc0d" containerName="watcher-api-log" Oct 03 18:32:43 crc kubenswrapper[4835]: E1003 18:32:43.373387 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc75da7c-52a7-421a-b04f-e9269a316a2e" containerName="glance-httpd" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.373393 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc75da7c-52a7-421a-b04f-e9269a316a2e" containerName="glance-httpd" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.373575 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc75da7c-52a7-421a-b04f-e9269a316a2e" containerName="glance-httpd" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.373593 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b461fe4-edb9-423d-b17e-7cb251f7fc0d" containerName="watcher-api" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.373610 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc75da7c-52a7-421a-b04f-e9269a316a2e" containerName="glance-log" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.373618 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b461fe4-edb9-423d-b17e-7cb251f7fc0d" containerName="watcher-api-log" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.373628 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" containerName="glance-httpd" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.373645 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" containerName="glance-log" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.374766 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.403577 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-z6gzn" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.403866 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.409083 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.410330 4835 scope.go:117] "RemoveContainer" containerID="87c1396fe1c1128c2855a281cb7362b9c9732a8d22d937fa507167d051e731cf" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.410576 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.444185 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.450383 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.473438 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.482562 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.485724 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.509288 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04606cff-96b7-4cec-a55c-806267b559dc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.509354 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.509403 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7db9\" (UniqueName: \"kubernetes.io/projected/04606cff-96b7-4cec-a55c-806267b559dc-kube-api-access-k7db9\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.509480 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.509533 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04606cff-96b7-4cec-a55c-806267b559dc-logs\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.509552 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.509664 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.509704 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.511232 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.526811 4835 scope.go:117] "RemoveContainer" containerID="c195933c0ae0d96122002799414129f837d005b33823bc491e94f174a16a025e" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.529236 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.546044 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.548570 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.552293 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.552413 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.571175 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.593443 4835 scope.go:117] "RemoveContainer" containerID="d9c8616fe8126b38220803ebb361899f50433e63e4da73fc8c7800beed22c9e0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.612576 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04606cff-96b7-4cec-a55c-806267b559dc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.612656 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e190761-0aec-4401-a034-6d490b395fff-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.612691 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.612735 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.612768 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " pod="openstack/watcher-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.612796 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e190761-0aec-4401-a034-6d490b395fff-logs\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.612818 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-scripts\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc 
kubenswrapper[4835]: I1003 18:32:43.612843 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvhwr\" (UniqueName: \"kubernetes.io/projected/9e190761-0aec-4401-a034-6d490b395fff-kube-api-access-mvhwr\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.612874 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7db9\" (UniqueName: \"kubernetes.io/projected/04606cff-96b7-4cec-a55c-806267b559dc-kube-api-access-k7db9\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.613031 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-config-data\") pod \"watcher-api-0\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " pod="openstack/watcher-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.613059 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-config-data\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.613185 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.613227 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.613268 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.613334 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04606cff-96b7-4cec-a55c-806267b559dc-logs\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.613360 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " pod="openstack/watcher-api-0" Oct 03 18:32:43 crc 
kubenswrapper[4835]: I1003 18:32:43.613389 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.613429 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.613452 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k66x8\" (UniqueName: \"kubernetes.io/projected/352d9656-3048-419a-9b51-c9e01d349bc6-kube-api-access-k66x8\") pod \"watcher-api-0\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " pod="openstack/watcher-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.613500 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/352d9656-3048-419a-9b51-c9e01d349bc6-logs\") pod \"watcher-api-0\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " pod="openstack/watcher-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.613545 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.614790 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04606cff-96b7-4cec-a55c-806267b559dc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.614950 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.615203 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04606cff-96b7-4cec-a55c-806267b559dc-logs\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.621206 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.633644 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.633695 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.640898 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.654512 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7db9\" (UniqueName: \"kubernetes.io/projected/04606cff-96b7-4cec-a55c-806267b559dc-kube-api-access-k7db9\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.707891 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.715045 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.715122 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " pod="openstack/watcher-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.715145 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e190761-0aec-4401-a034-6d490b395fff-logs\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.715161 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-scripts\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.715202 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvhwr\" (UniqueName: \"kubernetes.io/projected/9e190761-0aec-4401-a034-6d490b395fff-kube-api-access-mvhwr\") pod \"glance-default-external-api-0\" 
(UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.715237 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-config-data\") pod \"watcher-api-0\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " pod="openstack/watcher-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.715271 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-config-data\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.715311 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.715348 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.715383 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " pod="openstack/watcher-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.715413 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k66x8\" (UniqueName: \"kubernetes.io/projected/352d9656-3048-419a-9b51-c9e01d349bc6-kube-api-access-k66x8\") pod \"watcher-api-0\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " pod="openstack/watcher-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.715441 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/352d9656-3048-419a-9b51-c9e01d349bc6-logs\") pod \"watcher-api-0\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " pod="openstack/watcher-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.715492 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e190761-0aec-4401-a034-6d490b395fff-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.715929 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e190761-0aec-4401-a034-6d490b395fff-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.716222 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded 
for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.716579 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e190761-0aec-4401-a034-6d490b395fff-logs\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.718792 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/352d9656-3048-419a-9b51-c9e01d349bc6-logs\") pod \"watcher-api-0\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " pod="openstack/watcher-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.725529 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-scripts\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.725859 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.726979 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " pod="openstack/watcher-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.726986 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " pod="openstack/watcher-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.739023 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-config-data\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.740539 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.744894 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-config-data\") pod \"watcher-api-0\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " pod="openstack/watcher-api-0" Oct 03 18:32:43 
crc kubenswrapper[4835]: I1003 18:32:43.745627 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k66x8\" (UniqueName: \"kubernetes.io/projected/352d9656-3048-419a-9b51-c9e01d349bc6-kube-api-access-k66x8\") pod \"watcher-api-0\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " pod="openstack/watcher-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.772124 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvhwr\" (UniqueName: \"kubernetes.io/projected/9e190761-0aec-4401-a034-6d490b395fff-kube-api-access-mvhwr\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.772562 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " pod="openstack/glance-default-external-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.787511 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.819651 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 03 18:32:43 crc kubenswrapper[4835]: I1003 18:32:43.869875 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 18:32:44 crc kubenswrapper[4835]: I1003 18:32:44.515586 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 03 18:32:44 crc kubenswrapper[4835]: I1003 18:32:44.609988 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 18:32:44 crc kubenswrapper[4835]: I1003 18:32:44.723735 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 18:32:44 crc kubenswrapper[4835]: I1003 18:32:44.916222 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b461fe4-edb9-423d-b17e-7cb251f7fc0d" path="/var/lib/kubelet/pods/1b461fe4-edb9-423d-b17e-7cb251f7fc0d/volumes" Oct 03 18:32:44 crc kubenswrapper[4835]: I1003 18:32:44.917689 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc75da7c-52a7-421a-b04f-e9269a316a2e" path="/var/lib/kubelet/pods/cc75da7c-52a7-421a-b04f-e9269a316a2e/volumes" Oct 03 18:32:44 crc kubenswrapper[4835]: I1003 18:32:44.918511 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f13fd9ca-4ac9-4a8f-a3fa-e9efed841985" path="/var/lib/kubelet/pods/f13fd9ca-4ac9-4a8f-a3fa-e9efed841985/volumes" Oct 03 18:32:45 crc kubenswrapper[4835]: I1003 18:32:45.077642 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:32:45 crc kubenswrapper[4835]: I1003 18:32:45.147175 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c66c75b9c-6v2j7"] Oct 03 18:32:45 crc kubenswrapper[4835]: I1003 18:32:45.147405 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" podUID="9e4ebb1c-34bd-44d8-a7c8-a69cb4856406" containerName="dnsmasq-dns" 
containerID="cri-o://7c61bc02b23bcd9fc6866001caa8e97586079c3bb9b456496ea093cc4186e4c0" gracePeriod=10 Oct 03 18:32:45 crc kubenswrapper[4835]: I1003 18:32:45.323991 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9e190761-0aec-4401-a034-6d490b395fff","Type":"ContainerStarted","Data":"08d1dd8665d2426c908543c92fe39adc5d69189fa8a83db883ea56c172633f4c"} Oct 03 18:32:45 crc kubenswrapper[4835]: I1003 18:32:45.334115 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"352d9656-3048-419a-9b51-c9e01d349bc6","Type":"ContainerStarted","Data":"e6fe85f3bc5b3acea795f802bf4cfe16437a2b4a2188b9cd022d79cf731a0899"} Oct 03 18:32:45 crc kubenswrapper[4835]: I1003 18:32:45.334156 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"352d9656-3048-419a-9b51-c9e01d349bc6","Type":"ContainerStarted","Data":"ea3368f2eb11d58090f8794af54d5b598bfca76a5de102607e9bf4f2204c1ca2"} Oct 03 18:32:45 crc kubenswrapper[4835]: I1003 18:32:45.334166 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"352d9656-3048-419a-9b51-c9e01d349bc6","Type":"ContainerStarted","Data":"9341b386bc6205d5c954c3e318fe1e760b695dc92b9855d56d617fdfccbb7efe"} Oct 03 18:32:45 crc kubenswrapper[4835]: I1003 18:32:45.335321 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 03 18:32:45 crc kubenswrapper[4835]: I1003 18:32:45.338327 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"04606cff-96b7-4cec-a55c-806267b559dc","Type":"ContainerStarted","Data":"d65acc9a3ad64d0a66803b61a59ed5067fe358c4a7ef3a830748203468e4703d"} Oct 03 18:32:45 crc kubenswrapper[4835]: I1003 18:32:45.341006 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="352d9656-3048-419a-9b51-c9e01d349bc6" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.168:9322/\": dial tcp 10.217.0.168:9322: connect: connection refused" Oct 03 18:32:45 crc kubenswrapper[4835]: I1003 18:32:45.347341 4835 generic.go:334] "Generic (PLEG): container finished" podID="9e4ebb1c-34bd-44d8-a7c8-a69cb4856406" containerID="7c61bc02b23bcd9fc6866001caa8e97586079c3bb9b456496ea093cc4186e4c0" exitCode=0 Oct 03 18:32:45 crc kubenswrapper[4835]: I1003 18:32:45.347390 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" event={"ID":"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406","Type":"ContainerDied","Data":"7c61bc02b23bcd9fc6866001caa8e97586079c3bb9b456496ea093cc4186e4c0"} Oct 03 18:32:45 crc kubenswrapper[4835]: I1003 18:32:45.370415 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.370392704 podStartE2EDuration="2.370392704s" podCreationTimestamp="2025-10-03 18:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:45.361441066 +0000 UTC m=+1107.077381938" watchObservedRunningTime="2025-10-03 18:32:45.370392704 +0000 UTC m=+1107.086333576" Oct 03 18:32:45 crc kubenswrapper[4835]: I1003 18:32:45.873579 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.003553 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jk6k\" (UniqueName: \"kubernetes.io/projected/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-kube-api-access-5jk6k\") pod \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.003703 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-dns-swift-storage-0\") pod \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.003736 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-config\") pod \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.003772 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-ovsdbserver-sb\") pod \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.003807 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-dns-svc\") pod \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.003829 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-ovsdbserver-nb\") pod \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\" (UID: \"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406\") " Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.012265 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-kube-api-access-5jk6k" (OuterVolumeSpecName: "kube-api-access-5jk6k") pod "9e4ebb1c-34bd-44d8-a7c8-a69cb4856406" (UID: "9e4ebb1c-34bd-44d8-a7c8-a69cb4856406"). InnerVolumeSpecName "kube-api-access-5jk6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.050366 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-config" (OuterVolumeSpecName: "config") pod "9e4ebb1c-34bd-44d8-a7c8-a69cb4856406" (UID: "9e4ebb1c-34bd-44d8-a7c8-a69cb4856406"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.055998 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9e4ebb1c-34bd-44d8-a7c8-a69cb4856406" (UID: "9e4ebb1c-34bd-44d8-a7c8-a69cb4856406"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.092995 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9e4ebb1c-34bd-44d8-a7c8-a69cb4856406" (UID: "9e4ebb1c-34bd-44d8-a7c8-a69cb4856406"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.094514 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9e4ebb1c-34bd-44d8-a7c8-a69cb4856406" (UID: "9e4ebb1c-34bd-44d8-a7c8-a69cb4856406"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.106238 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.106267 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.106281 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.106290 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jk6k\" (UniqueName: \"kubernetes.io/projected/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-kube-api-access-5jk6k\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.106300 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.137326 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e4ebb1c-34bd-44d8-a7c8-a69cb4856406" (UID: "9e4ebb1c-34bd-44d8-a7c8-a69cb4856406"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.207950 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.363887 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"04606cff-96b7-4cec-a55c-806267b559dc","Type":"ContainerStarted","Data":"0010c15727a0daf6e38ac8b5d622875a87191a587dfa9535b109c0c7f42377d8"} Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.367413 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.367469 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c66c75b9c-6v2j7" event={"ID":"9e4ebb1c-34bd-44d8-a7c8-a69cb4856406","Type":"ContainerDied","Data":"4528d4c1f22a5591e42782b7b1d3c97c83a9364fffb62d55ee5ca95a5ccbbff7"} Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.367516 4835 scope.go:117] "RemoveContainer" containerID="7c61bc02b23bcd9fc6866001caa8e97586079c3bb9b456496ea093cc4186e4c0" Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.369803 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9e190761-0aec-4401-a034-6d490b395fff","Type":"ContainerStarted","Data":"1dc43fdb4cd3d3dec291423d45d74832fe7d50111ba1ce4278899ee31c21ba9e"} Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.374482 4835 generic.go:334] "Generic (PLEG): container finished" podID="4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a" containerID="612fafc7ff04680a4fe6b0807c59155113a4aefe07daa8e9ca167e389427b7c1" exitCode=0 Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.375249 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t2cnf" event={"ID":"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a","Type":"ContainerDied","Data":"612fafc7ff04680a4fe6b0807c59155113a4aefe07daa8e9ca167e389427b7c1"} Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.406172 4835 scope.go:117] "RemoveContainer" containerID="09637d74d03dd5d26157f5fb908c8c2ddcdc1d2babcc7bbb74a92863ac5f3901" Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.413577 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c66c75b9c-6v2j7"] Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.432372 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c66c75b9c-6v2j7"] Oct 03 18:32:46 crc kubenswrapper[4835]: I1003 18:32:46.892625 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e4ebb1c-34bd-44d8-a7c8-a69cb4856406" path="/var/lib/kubelet/pods/9e4ebb1c-34bd-44d8-a7c8-a69cb4856406/volumes" Oct 03 18:32:47 crc kubenswrapper[4835]: I1003 18:32:47.387612 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9e190761-0aec-4401-a034-6d490b395fff","Type":"ContainerStarted","Data":"cb752a1534060eb91749498398853c16725bd84e16743d4c1fb0ebf537841452"} Oct 03 18:32:47 crc kubenswrapper[4835]: I1003 18:32:47.391695 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"04606cff-96b7-4cec-a55c-806267b559dc","Type":"ContainerStarted","Data":"f7ebab970bb4e77e30fcf09a64074147373976e59473678be8aaee33eb783b59"} Oct 03 18:32:47 crc kubenswrapper[4835]: I1003 18:32:47.419366 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.419280534 podStartE2EDuration="4.419280534s" podCreationTimestamp="2025-10-03 18:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:47.410717046 +0000 UTC m=+1109.126657918" watchObservedRunningTime="2025-10-03 18:32:47.419280534 +0000 UTC m=+1109.135221416" Oct 03 18:32:47 crc kubenswrapper[4835]: I1003 18:32:47.447627 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.447604943 podStartE2EDuration="4.447604943s" podCreationTimestamp="2025-10-03 18:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:47.436544443 +0000 UTC m=+1109.152485335" watchObservedRunningTime="2025-10-03 18:32:47.447604943 +0000 UTC m=+1109.163545815" Oct 03 18:32:48 crc kubenswrapper[4835]: I1003 18:32:48.155245 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Oct 03 18:32:48 crc kubenswrapper[4835]: I1003 18:32:48.165919 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 18:32:48 crc kubenswrapper[4835]: E1003 18:32:48.166389 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de105950d6ac7e7156a461005afa919948679bf395b58c050488f8cb4863a6cf is running failed: container process not found" containerID="de105950d6ac7e7156a461005afa919948679bf395b58c050488f8cb4863a6cf" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Oct 03 18:32:48 crc kubenswrapper[4835]: E1003 18:32:48.166716 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de105950d6ac7e7156a461005afa919948679bf395b58c050488f8cb4863a6cf is running failed: container process not found" containerID="de105950d6ac7e7156a461005afa919948679bf395b58c050488f8cb4863a6cf" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Oct 03 18:32:48 crc kubenswrapper[4835]: E1003 18:32:48.167029 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de105950d6ac7e7156a461005afa919948679bf395b58c050488f8cb4863a6cf is running failed: container process not found" containerID="de105950d6ac7e7156a461005afa919948679bf395b58c050488f8cb4863a6cf" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Oct 03 18:32:48 crc kubenswrapper[4835]: E1003 18:32:48.167129 4835 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of de105950d6ac7e7156a461005afa919948679bf395b58c050488f8cb4863a6cf is running failed: container process not found" probeType="Startup" pod="openstack/watcher-decision-engine-0" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" Oct 03 18:32:48 crc kubenswrapper[4835]: I1003 18:32:48.181488 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Oct 03 18:32:48 crc kubenswrapper[4835]: I1003 18:32:48.403290 4835 generic.go:334] "Generic (PLEG): container finished" podID="ddee52bd-4539-46b6-a51f-50fe9278665a" containerID="5cd877aa81327a4fb6f95fd381cf6d8fa29e66b4e8e1d3ca29fd19134069036c" exitCode=0 Oct 03 18:32:48 crc kubenswrapper[4835]: I1003 18:32:48.403364 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xkgv4" event={"ID":"ddee52bd-4539-46b6-a51f-50fe9278665a","Type":"ContainerDied","Data":"5cd877aa81327a4fb6f95fd381cf6d8fa29e66b4e8e1d3ca29fd19134069036c"} Oct 03 18:32:48 crc kubenswrapper[4835]: I1003 18:32:48.404923 4835 generic.go:334] "Generic (PLEG): container finished" 
podID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerID="de105950d6ac7e7156a461005afa919948679bf395b58c050488f8cb4863a6cf" exitCode=1 Oct 03 18:32:48 crc kubenswrapper[4835]: I1003 18:32:48.404962 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8110d0e5-9e19-4306-b8aa-babe937e8d2a","Type":"ContainerDied","Data":"de105950d6ac7e7156a461005afa919948679bf395b58c050488f8cb4863a6cf"} Oct 03 18:32:48 crc kubenswrapper[4835]: I1003 18:32:48.405962 4835 scope.go:117] "RemoveContainer" containerID="de105950d6ac7e7156a461005afa919948679bf395b58c050488f8cb4863a6cf" Oct 03 18:32:48 crc kubenswrapper[4835]: I1003 18:32:48.455167 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Oct 03 18:32:48 crc kubenswrapper[4835]: I1003 18:32:48.484754 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Oct 03 18:32:48 crc kubenswrapper[4835]: I1003 18:32:48.819699 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 03 18:32:48 crc kubenswrapper[4835]: I1003 18:32:48.908661 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 03 18:32:49 crc kubenswrapper[4835]: I1003 18:32:49.422998 4835 generic.go:334] "Generic (PLEG): container finished" podID="709d622b-7993-4d18-8185-10b4f1c81d79" containerID="e80a414b0a4a919d1e0d59a1d6d41f19a37acaf0b810e3050b6700b1ac80c2cd" exitCode=0 Oct 03 18:32:49 crc kubenswrapper[4835]: I1003 18:32:49.423115 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fz6rs" event={"ID":"709d622b-7993-4d18-8185-10b4f1c81d79","Type":"ContainerDied","Data":"e80a414b0a4a919d1e0d59a1d6d41f19a37acaf0b810e3050b6700b1ac80c2cd"} Oct 03 18:32:50 crc kubenswrapper[4835]: I1003 18:32:50.431438 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="4870398d-de86-4dc6-9052-b6e80bfe27f5" containerName="watcher-applier" containerID="cri-o://0d08847f90439a606c24746338e1babf388d0f4e01e141e2fe52bc5dffda55b9" gracePeriod=30 Oct 03 18:32:50 crc kubenswrapper[4835]: I1003 18:32:50.741296 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:50 crc kubenswrapper[4835]: I1003 18:32:50.741335 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:32:50 crc kubenswrapper[4835]: I1003 18:32:50.742469 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-64dcfd48b6-tpcpd" podUID="de5d465a-f009-4cef-940e-3b2aaa64468b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.165:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.165:8443: connect: connection refused" Oct 03 18:32:50 crc kubenswrapper[4835]: I1003 18:32:50.816314 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:50 crc kubenswrapper[4835]: I1003 18:32:50.816325 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-84859df966-b4t26" podUID="dbf2013d-5dc5-4fe6-a408-08757b74ecc8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.166:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.166:8443: connect: connection refused" Oct 03 18:32:50 crc kubenswrapper[4835]: I1003 
18:32:50.816361 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-84859df966-b4t26" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.392851 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.432702 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-credential-keys\") pod \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.432769 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-fernet-keys\") pod \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.432838 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-scripts\") pod \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.432928 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-combined-ca-bundle\") pod \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.433036 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-config-data\") pod \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.433059 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7gpr\" (UniqueName: \"kubernetes.io/projected/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-kube-api-access-b7gpr\") pod \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\" (UID: \"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a\") " Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.435322 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fz6rs" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.441062 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-kube-api-access-b7gpr" (OuterVolumeSpecName: "kube-api-access-b7gpr") pod "4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a" (UID: "4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a"). InnerVolumeSpecName "kube-api-access-b7gpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.449786 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xkgv4" event={"ID":"ddee52bd-4539-46b6-a51f-50fe9278665a","Type":"ContainerDied","Data":"347d75c5230270d1f8503c10e9e0071b51c556f0b0ca6529f95bf6a95b9369a1"} Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.449824 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="347d75c5230270d1f8503c10e9e0071b51c556f0b0ca6529f95bf6a95b9369a1" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.452773 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a" (UID: "4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.462527 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-scripts" (OuterVolumeSpecName: "scripts") pod "4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a" (UID: "4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.464439 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fz6rs" event={"ID":"709d622b-7993-4d18-8185-10b4f1c81d79","Type":"ContainerDied","Data":"e2fd3dc327cf9dd3fa12882bcdd92f5012261969d1f0901eb8e388812eb47729"} Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.464478 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2fd3dc327cf9dd3fa12882bcdd92f5012261969d1f0901eb8e388812eb47729" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.464549 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fz6rs" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.465270 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a" (UID: "4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.466648 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t2cnf" event={"ID":"4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a","Type":"ContainerDied","Data":"b3e9ec9d5d783e5d54ea9bea486a9371c339669ee0c4478ce584a30863540d2f"} Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.466674 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3e9ec9d5d783e5d54ea9bea486a9371c339669ee0c4478ce584a30863540d2f" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.466808 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-t2cnf" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.510638 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-config-data" (OuterVolumeSpecName: "config-data") pod "4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a" (UID: "4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.524290 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.530427 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a" (UID: "4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.534027 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/709d622b-7993-4d18-8185-10b4f1c81d79-db-sync-config-data\") pod \"709d622b-7993-4d18-8185-10b4f1c81d79\" (UID: \"709d622b-7993-4d18-8185-10b4f1c81d79\") " Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.534290 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/709d622b-7993-4d18-8185-10b4f1c81d79-combined-ca-bundle\") pod \"709d622b-7993-4d18-8185-10b4f1c81d79\" (UID: \"709d622b-7993-4d18-8185-10b4f1c81d79\") " Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.534340 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb4l2\" (UniqueName: \"kubernetes.io/projected/709d622b-7993-4d18-8185-10b4f1c81d79-kube-api-access-qb4l2\") pod \"709d622b-7993-4d18-8185-10b4f1c81d79\" (UID: \"709d622b-7993-4d18-8185-10b4f1c81d79\") " Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.534694 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.534712 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.534722 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.534732 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7gpr\" (UniqueName: \"kubernetes.io/projected/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-kube-api-access-b7gpr\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.534740 4835 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 
18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.534748 4835 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.537491 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/709d622b-7993-4d18-8185-10b4f1c81d79-kube-api-access-qb4l2" (OuterVolumeSpecName: "kube-api-access-qb4l2") pod "709d622b-7993-4d18-8185-10b4f1c81d79" (UID: "709d622b-7993-4d18-8185-10b4f1c81d79"). InnerVolumeSpecName "kube-api-access-qb4l2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.540858 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709d622b-7993-4d18-8185-10b4f1c81d79-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "709d622b-7993-4d18-8185-10b4f1c81d79" (UID: "709d622b-7993-4d18-8185-10b4f1c81d79"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.578245 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/709d622b-7993-4d18-8185-10b4f1c81d79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "709d622b-7993-4d18-8185-10b4f1c81d79" (UID: "709d622b-7993-4d18-8185-10b4f1c81d79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.636022 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddee52bd-4539-46b6-a51f-50fe9278665a-logs\") pod \"ddee52bd-4539-46b6-a51f-50fe9278665a\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.636150 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-scripts\") pod \"ddee52bd-4539-46b6-a51f-50fe9278665a\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.636299 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-combined-ca-bundle\") pod \"ddee52bd-4539-46b6-a51f-50fe9278665a\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.636343 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-config-data\") pod \"ddee52bd-4539-46b6-a51f-50fe9278665a\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.636376 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb5zf\" (UniqueName: \"kubernetes.io/projected/ddee52bd-4539-46b6-a51f-50fe9278665a-kube-api-access-bb5zf\") pod \"ddee52bd-4539-46b6-a51f-50fe9278665a\" (UID: \"ddee52bd-4539-46b6-a51f-50fe9278665a\") " Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.636747 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/709d622b-7993-4d18-8185-10b4f1c81d79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.636763 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb4l2\" (UniqueName: \"kubernetes.io/projected/709d622b-7993-4d18-8185-10b4f1c81d79-kube-api-access-qb4l2\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.636774 4835 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/709d622b-7993-4d18-8185-10b4f1c81d79-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.636808 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddee52bd-4539-46b6-a51f-50fe9278665a-logs" (OuterVolumeSpecName: "logs") pod "ddee52bd-4539-46b6-a51f-50fe9278665a" (UID: "ddee52bd-4539-46b6-a51f-50fe9278665a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.639202 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddee52bd-4539-46b6-a51f-50fe9278665a-kube-api-access-bb5zf" (OuterVolumeSpecName: "kube-api-access-bb5zf") pod "ddee52bd-4539-46b6-a51f-50fe9278665a" (UID: "ddee52bd-4539-46b6-a51f-50fe9278665a"). InnerVolumeSpecName "kube-api-access-bb5zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.639594 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-scripts" (OuterVolumeSpecName: "scripts") pod "ddee52bd-4539-46b6-a51f-50fe9278665a" (UID: "ddee52bd-4539-46b6-a51f-50fe9278665a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.659575 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddee52bd-4539-46b6-a51f-50fe9278665a" (UID: "ddee52bd-4539-46b6-a51f-50fe9278665a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.676026 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-config-data" (OuterVolumeSpecName: "config-data") pod "ddee52bd-4539-46b6-a51f-50fe9278665a" (UID: "ddee52bd-4539-46b6-a51f-50fe9278665a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.738630 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddee52bd-4539-46b6-a51f-50fe9278665a-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.738921 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.738932 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.738943 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddee52bd-4539-46b6-a51f-50fe9278665a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:51 crc kubenswrapper[4835]: I1003 18:32:51.738952 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb5zf\" (UniqueName: \"kubernetes.io/projected/ddee52bd-4539-46b6-a51f-50fe9278665a-kube-api-access-bb5zf\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.478524 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8110d0e5-9e19-4306-b8aa-babe937e8d2a","Type":"ContainerStarted","Data":"e806a91c1d0ed6742cb427f345aabf9762384e27525ea6c3dea58c86eb291aac"} Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.481064 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af75f57a-7612-48c8-b3fb-8594e81e2d0a","Type":"ContainerStarted","Data":"2d8b620a897b5436e95f47f581862450b09d7ca2e085f89555d7329fe76e3fa1"} Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.481118 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-xkgv4" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.502166 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5c4db54587-knmn7"] Oct 03 18:32:52 crc kubenswrapper[4835]: E1003 18:32:52.502632 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a" containerName="keystone-bootstrap" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.502654 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a" containerName="keystone-bootstrap" Oct 03 18:32:52 crc kubenswrapper[4835]: E1003 18:32:52.502673 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e4ebb1c-34bd-44d8-a7c8-a69cb4856406" containerName="init" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.502684 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e4ebb1c-34bd-44d8-a7c8-a69cb4856406" containerName="init" Oct 03 18:32:52 crc kubenswrapper[4835]: E1003 18:32:52.502716 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddee52bd-4539-46b6-a51f-50fe9278665a" containerName="placement-db-sync" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.502724 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddee52bd-4539-46b6-a51f-50fe9278665a" containerName="placement-db-sync" Oct 03 18:32:52 crc kubenswrapper[4835]: E1003 18:32:52.502742 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709d622b-7993-4d18-8185-10b4f1c81d79" containerName="barbican-db-sync" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.502750 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="709d622b-7993-4d18-8185-10b4f1c81d79" containerName="barbican-db-sync" Oct 03 18:32:52 crc kubenswrapper[4835]: E1003 18:32:52.502780 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e4ebb1c-34bd-44d8-a7c8-a69cb4856406" containerName="dnsmasq-dns" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.502788 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e4ebb1c-34bd-44d8-a7c8-a69cb4856406" containerName="dnsmasq-dns" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.503018 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e4ebb1c-34bd-44d8-a7c8-a69cb4856406" containerName="dnsmasq-dns" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.503042 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a" containerName="keystone-bootstrap" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.503059 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddee52bd-4539-46b6-a51f-50fe9278665a" containerName="placement-db-sync" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.503096 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="709d622b-7993-4d18-8185-10b4f1c81d79" containerName="barbican-db-sync" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.503901 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.506848 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vd8gf" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.506908 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.506917 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.506932 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.506848 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.507230 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.532438 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5c4db54587-knmn7"] Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.553038 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-fernet-keys\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.553098 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-public-tls-certs\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.553183 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-scripts\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.553205 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-config-data\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.553264 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-credential-keys\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.553293 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d5lv\" (UniqueName: \"kubernetes.io/projected/522ccc9d-3dab-4ee6-8a2a-882de9d37457-kube-api-access-6d5lv\") pod \"keystone-5c4db54587-knmn7\" (UID: 
\"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.553317 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-combined-ca-bundle\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.553358 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-internal-tls-certs\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.655422 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-fernet-keys\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.655466 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-public-tls-certs\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.655526 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-scripts\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.655543 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-config-data\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.655580 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-credential-keys\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.655604 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d5lv\" (UniqueName: \"kubernetes.io/projected/522ccc9d-3dab-4ee6-8a2a-882de9d37457-kube-api-access-6d5lv\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.655625 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-combined-ca-bundle\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " 
pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.655665 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-internal-tls-certs\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.660413 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-credential-keys\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.664027 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-config-data\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.664448 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-combined-ca-bundle\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.678563 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-scripts\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.679995 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-fernet-keys\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.681640 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-public-tls-certs\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.686381 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-8585b7f888-6wk2b"] Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.686644 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/522ccc9d-3dab-4ee6-8a2a-882de9d37457-internal-tls-certs\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.688548 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d5lv\" (UniqueName: \"kubernetes.io/projected/522ccc9d-3dab-4ee6-8a2a-882de9d37457-kube-api-access-6d5lv\") pod \"keystone-5c4db54587-knmn7\" (UID: \"522ccc9d-3dab-4ee6-8a2a-882de9d37457\") " 
pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.690049 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.694623 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.694923 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.695153 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6gnk2" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.733720 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8585b7f888-6wk2b"] Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.746443 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-649c499755-ttlj6"] Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.747920 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.750063 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.757489 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42897a96-1d94-485f-9448-792d48138492-config-data\") pod \"barbican-keystone-listener-8585b7f888-6wk2b\" (UID: \"42897a96-1d94-485f-9448-792d48138492\") " pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.757567 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42897a96-1d94-485f-9448-792d48138492-logs\") pod \"barbican-keystone-listener-8585b7f888-6wk2b\" (UID: \"42897a96-1d94-485f-9448-792d48138492\") " pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.757624 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42897a96-1d94-485f-9448-792d48138492-combined-ca-bundle\") pod \"barbican-keystone-listener-8585b7f888-6wk2b\" (UID: \"42897a96-1d94-485f-9448-792d48138492\") " pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.757660 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m97px\" (UniqueName: \"kubernetes.io/projected/42897a96-1d94-485f-9448-792d48138492-kube-api-access-m97px\") pod \"barbican-keystone-listener-8585b7f888-6wk2b\" (UID: \"42897a96-1d94-485f-9448-792d48138492\") " pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.757680 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42897a96-1d94-485f-9448-792d48138492-config-data-custom\") pod \"barbican-keystone-listener-8585b7f888-6wk2b\" (UID: 
\"42897a96-1d94-485f-9448-792d48138492\") " pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.760924 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-799c89c95d-bzssk"] Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.767789 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.770381 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.770586 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.770878 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.770901 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.776520 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7wl7m" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.780555 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-649c499755-ttlj6"] Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.823695 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.861780 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb97c512-09cc-43f3-8619-7083c7b803ff-internal-tls-certs\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862105 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb97c512-09cc-43f3-8619-7083c7b803ff-logs\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862139 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42897a96-1d94-485f-9448-792d48138492-combined-ca-bundle\") pod \"barbican-keystone-listener-8585b7f888-6wk2b\" (UID: \"42897a96-1d94-485f-9448-792d48138492\") " pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862168 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb97c512-09cc-43f3-8619-7083c7b803ff-config-data\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862188 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m97px\" (UniqueName: \"kubernetes.io/projected/42897a96-1d94-485f-9448-792d48138492-kube-api-access-m97px\") pod 
\"barbican-keystone-listener-8585b7f888-6wk2b\" (UID: \"42897a96-1d94-485f-9448-792d48138492\") " pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862206 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb97c512-09cc-43f3-8619-7083c7b803ff-public-tls-certs\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862226 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42897a96-1d94-485f-9448-792d48138492-config-data-custom\") pod \"barbican-keystone-listener-8585b7f888-6wk2b\" (UID: \"42897a96-1d94-485f-9448-792d48138492\") " pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862254 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb97c512-09cc-43f3-8619-7083c7b803ff-combined-ca-bundle\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862272 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9343cae5-9ae1-4cae-b5a1-31acc9b34217-config-data\") pod \"barbican-worker-649c499755-ttlj6\" (UID: \"9343cae5-9ae1-4cae-b5a1-31acc9b34217\") " pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862305 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4nbb\" (UniqueName: \"kubernetes.io/projected/9343cae5-9ae1-4cae-b5a1-31acc9b34217-kube-api-access-w4nbb\") pod \"barbican-worker-649c499755-ttlj6\" (UID: \"9343cae5-9ae1-4cae-b5a1-31acc9b34217\") " pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862326 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mr9k\" (UniqueName: \"kubernetes.io/projected/eb97c512-09cc-43f3-8619-7083c7b803ff-kube-api-access-4mr9k\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862342 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9343cae5-9ae1-4cae-b5a1-31acc9b34217-logs\") pod \"barbican-worker-649c499755-ttlj6\" (UID: \"9343cae5-9ae1-4cae-b5a1-31acc9b34217\") " pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862360 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42897a96-1d94-485f-9448-792d48138492-config-data\") pod \"barbican-keystone-listener-8585b7f888-6wk2b\" (UID: \"42897a96-1d94-485f-9448-792d48138492\") " pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862388 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb97c512-09cc-43f3-8619-7083c7b803ff-scripts\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862419 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9343cae5-9ae1-4cae-b5a1-31acc9b34217-combined-ca-bundle\") pod \"barbican-worker-649c499755-ttlj6\" (UID: \"9343cae5-9ae1-4cae-b5a1-31acc9b34217\") " pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862436 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9343cae5-9ae1-4cae-b5a1-31acc9b34217-config-data-custom\") pod \"barbican-worker-649c499755-ttlj6\" (UID: \"9343cae5-9ae1-4cae-b5a1-31acc9b34217\") " pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862455 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42897a96-1d94-485f-9448-792d48138492-logs\") pod \"barbican-keystone-listener-8585b7f888-6wk2b\" (UID: \"42897a96-1d94-485f-9448-792d48138492\") " pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.862746 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42897a96-1d94-485f-9448-792d48138492-logs\") pod \"barbican-keystone-listener-8585b7f888-6wk2b\" (UID: \"42897a96-1d94-485f-9448-792d48138492\") " pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.863161 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-799c89c95d-bzssk"] Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.874942 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42897a96-1d94-485f-9448-792d48138492-config-data\") pod \"barbican-keystone-listener-8585b7f888-6wk2b\" (UID: \"42897a96-1d94-485f-9448-792d48138492\") " pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.886802 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42897a96-1d94-485f-9448-792d48138492-combined-ca-bundle\") pod \"barbican-keystone-listener-8585b7f888-6wk2b\" (UID: \"42897a96-1d94-485f-9448-792d48138492\") " pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.896563 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m97px\" (UniqueName: \"kubernetes.io/projected/42897a96-1d94-485f-9448-792d48138492-kube-api-access-m97px\") pod \"barbican-keystone-listener-8585b7f888-6wk2b\" (UID: \"42897a96-1d94-485f-9448-792d48138492\") " pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.931231 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/42897a96-1d94-485f-9448-792d48138492-config-data-custom\") pod \"barbican-keystone-listener-8585b7f888-6wk2b\" (UID: \"42897a96-1d94-485f-9448-792d48138492\") " pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.971949 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb97c512-09cc-43f3-8619-7083c7b803ff-config-data\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.972029 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb97c512-09cc-43f3-8619-7083c7b803ff-public-tls-certs\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.972126 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb97c512-09cc-43f3-8619-7083c7b803ff-combined-ca-bundle\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.972146 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9343cae5-9ae1-4cae-b5a1-31acc9b34217-config-data\") pod \"barbican-worker-649c499755-ttlj6\" (UID: \"9343cae5-9ae1-4cae-b5a1-31acc9b34217\") " pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.972231 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4nbb\" (UniqueName: \"kubernetes.io/projected/9343cae5-9ae1-4cae-b5a1-31acc9b34217-kube-api-access-w4nbb\") pod \"barbican-worker-649c499755-ttlj6\" (UID: \"9343cae5-9ae1-4cae-b5a1-31acc9b34217\") " pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.972274 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mr9k\" (UniqueName: \"kubernetes.io/projected/eb97c512-09cc-43f3-8619-7083c7b803ff-kube-api-access-4mr9k\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.972301 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9343cae5-9ae1-4cae-b5a1-31acc9b34217-logs\") pod \"barbican-worker-649c499755-ttlj6\" (UID: \"9343cae5-9ae1-4cae-b5a1-31acc9b34217\") " pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.972362 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb97c512-09cc-43f3-8619-7083c7b803ff-scripts\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.972459 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9343cae5-9ae1-4cae-b5a1-31acc9b34217-combined-ca-bundle\") pod \"barbican-worker-649c499755-ttlj6\" (UID: \"9343cae5-9ae1-4cae-b5a1-31acc9b34217\") " pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.972503 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9343cae5-9ae1-4cae-b5a1-31acc9b34217-config-data-custom\") pod \"barbican-worker-649c499755-ttlj6\" (UID: \"9343cae5-9ae1-4cae-b5a1-31acc9b34217\") " pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.972576 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb97c512-09cc-43f3-8619-7083c7b803ff-internal-tls-certs\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.972644 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb97c512-09cc-43f3-8619-7083c7b803ff-logs\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.973209 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb97c512-09cc-43f3-8619-7083c7b803ff-logs\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.978596 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d487cf869-fnqtk"] Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.985213 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9343cae5-9ae1-4cae-b5a1-31acc9b34217-logs\") pod \"barbican-worker-649c499755-ttlj6\" (UID: \"9343cae5-9ae1-4cae-b5a1-31acc9b34217\") " pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.992842 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb97c512-09cc-43f3-8619-7083c7b803ff-internal-tls-certs\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.993314 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb97c512-09cc-43f3-8619-7083c7b803ff-public-tls-certs\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.993839 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb97c512-09cc-43f3-8619-7083c7b803ff-config-data\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.994024 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/eb97c512-09cc-43f3-8619-7083c7b803ff-scripts\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.994937 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9343cae5-9ae1-4cae-b5a1-31acc9b34217-combined-ca-bundle\") pod \"barbican-worker-649c499755-ttlj6\" (UID: \"9343cae5-9ae1-4cae-b5a1-31acc9b34217\") " pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.994940 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb97c512-09cc-43f3-8619-7083c7b803ff-combined-ca-bundle\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:52 crc kubenswrapper[4835]: I1003 18:32:52.999806 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d487cf869-fnqtk"] Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.000059 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.002626 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mr9k\" (UniqueName: \"kubernetes.io/projected/eb97c512-09cc-43f3-8619-7083c7b803ff-kube-api-access-4mr9k\") pod \"placement-799c89c95d-bzssk\" (UID: \"eb97c512-09cc-43f3-8619-7083c7b803ff\") " pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.006473 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9343cae5-9ae1-4cae-b5a1-31acc9b34217-config-data\") pod \"barbican-worker-649c499755-ttlj6\" (UID: \"9343cae5-9ae1-4cae-b5a1-31acc9b34217\") " pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.008401 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4nbb\" (UniqueName: \"kubernetes.io/projected/9343cae5-9ae1-4cae-b5a1-31acc9b34217-kube-api-access-w4nbb\") pod \"barbican-worker-649c499755-ttlj6\" (UID: \"9343cae5-9ae1-4cae-b5a1-31acc9b34217\") " pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.015429 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9343cae5-9ae1-4cae-b5a1-31acc9b34217-config-data-custom\") pod \"barbican-worker-649c499755-ttlj6\" (UID: \"9343cae5-9ae1-4cae-b5a1-31acc9b34217\") " pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.061209 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5b45756688-tbfsl"] Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.062830 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.067294 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.069152 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b45756688-tbfsl"] Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.116049 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.145770 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-649c499755-ttlj6" Oct 03 18:32:53 crc kubenswrapper[4835]: E1003 18:32:53.157523 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0d08847f90439a606c24746338e1babf388d0f4e01e141e2fe52bc5dffda55b9" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 18:32:53 crc kubenswrapper[4835]: E1003 18:32:53.163462 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0d08847f90439a606c24746338e1babf388d0f4e01e141e2fe52bc5dffda55b9" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.169491 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:53 crc kubenswrapper[4835]: E1003 18:32:53.169616 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0d08847f90439a606c24746338e1babf388d0f4e01e141e2fe52bc5dffda55b9" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 18:32:53 crc kubenswrapper[4835]: E1003 18:32:53.169677 4835 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="4870398d-de86-4dc6-9052-b6e80bfe27f5" containerName="watcher-applier" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.178627 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zprm9\" (UniqueName: \"kubernetes.io/projected/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-kube-api-access-zprm9\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.178676 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-ovsdbserver-nb\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.178701 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-config-data-custom\") pod \"barbican-api-5b45756688-tbfsl\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.178759 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-config-data\") pod \"barbican-api-5b45756688-tbfsl\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.178779 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4snv\" (UniqueName: \"kubernetes.io/projected/1505cc32-6896-425d-b36b-1b2d3504901b-kube-api-access-g4snv\") pod \"barbican-api-5b45756688-tbfsl\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.178811 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-ovsdbserver-sb\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.178826 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-dns-swift-storage-0\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.178865 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1505cc32-6896-425d-b36b-1b2d3504901b-logs\") pod \"barbican-api-5b45756688-tbfsl\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.178881 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-combined-ca-bundle\") pod \"barbican-api-5b45756688-tbfsl\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.178903 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-config\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.178941 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-dns-svc\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.280295 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-config-data\") pod \"barbican-api-5b45756688-tbfsl\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.280363 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4snv\" (UniqueName: \"kubernetes.io/projected/1505cc32-6896-425d-b36b-1b2d3504901b-kube-api-access-g4snv\") pod \"barbican-api-5b45756688-tbfsl\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.280421 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-ovsdbserver-sb\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.280439 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-dns-swift-storage-0\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.280509 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1505cc32-6896-425d-b36b-1b2d3504901b-logs\") pod \"barbican-api-5b45756688-tbfsl\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.280531 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-combined-ca-bundle\") pod \"barbican-api-5b45756688-tbfsl\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.280576 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-config\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.280617 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-dns-svc\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.280664 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zprm9\" (UniqueName: \"kubernetes.io/projected/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-kube-api-access-zprm9\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.280690 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-ovsdbserver-nb\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.280731 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-config-data-custom\") pod \"barbican-api-5b45756688-tbfsl\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.286498 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1505cc32-6896-425d-b36b-1b2d3504901b-logs\") pod \"barbican-api-5b45756688-tbfsl\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.287376 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-ovsdbserver-sb\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.290267 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-config-data-custom\") pod \"barbican-api-5b45756688-tbfsl\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.295776 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-dns-svc\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.297726 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-ovsdbserver-nb\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.308879 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-dns-swift-storage-0\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.311538 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-combined-ca-bundle\") pod \"barbican-api-5b45756688-tbfsl\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.311841 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4snv\" (UniqueName: 
\"kubernetes.io/projected/1505cc32-6896-425d-b36b-1b2d3504901b-kube-api-access-g4snv\") pod \"barbican-api-5b45756688-tbfsl\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.312302 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-config\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.312538 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-config-data\") pod \"barbican-api-5b45756688-tbfsl\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.320540 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zprm9\" (UniqueName: \"kubernetes.io/projected/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-kube-api-access-zprm9\") pod \"dnsmasq-dns-6d487cf869-fnqtk\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.343631 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.428565 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.590776 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5c4db54587-knmn7"] Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.621391 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-649c499755-ttlj6"] Oct 03 18:32:53 crc kubenswrapper[4835]: W1003 18:32:53.639798 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod522ccc9d_3dab_4ee6_8a2a_882de9d37457.slice/crio-a3711013cd4c28ea8c8ca24f2999801e6ebfe2dc00ecbd98b2fa9de13b3d38bc WatchSource:0}: Error finding container a3711013cd4c28ea8c8ca24f2999801e6ebfe2dc00ecbd98b2fa9de13b3d38bc: Status 404 returned error can't find the container with id a3711013cd4c28ea8c8ca24f2999801e6ebfe2dc00ecbd98b2fa9de13b3d38bc Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.793614 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.793659 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.820898 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.870705 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.870957 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 
18:32:53.959489 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8585b7f888-6wk2b"] Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.966424 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 03 18:32:53 crc kubenswrapper[4835]: I1003 18:32:53.986882 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.022414 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.037459 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.041732 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-799c89c95d-bzssk"] Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.053983 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d487cf869-fnqtk"] Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.064916 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b45756688-tbfsl"] Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.101524 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.619173 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5c4db54587-knmn7" event={"ID":"522ccc9d-3dab-4ee6-8a2a-882de9d37457","Type":"ContainerStarted","Data":"b1c758b6b1b33ea1f0ae15db317a3f014b5d1f84ef3595bcb24c3ffb35ecbc29"} Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.619688 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5c4db54587-knmn7" event={"ID":"522ccc9d-3dab-4ee6-8a2a-882de9d37457","Type":"ContainerStarted","Data":"a3711013cd4c28ea8c8ca24f2999801e6ebfe2dc00ecbd98b2fa9de13b3d38bc"} Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.621148 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.635763 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b45756688-tbfsl" event={"ID":"1505cc32-6896-425d-b36b-1b2d3504901b","Type":"ContainerStarted","Data":"3786801d1aa6390ca652db72432821110148bfc2193c91680ff9359675379e1a"} Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.640024 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5c4db54587-knmn7" podStartSLOduration=2.638695227 podStartE2EDuration="2.638695227s" podCreationTimestamp="2025-10-03 18:32:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:54.638660137 +0000 UTC m=+1116.354601009" watchObservedRunningTime="2025-10-03 18:32:54.638695227 +0000 UTC m=+1116.354636099" Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.653307 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-649c499755-ttlj6" event={"ID":"9343cae5-9ae1-4cae-b5a1-31acc9b34217","Type":"ContainerStarted","Data":"f4459605e915845f7fb2896baa8594c6e712bd1a953de4a7b40b66ac4eeb4da2"} Oct 03 18:32:54 crc 
kubenswrapper[4835]: I1003 18:32:54.658409 4835 generic.go:334] "Generic (PLEG): container finished" podID="4870398d-de86-4dc6-9052-b6e80bfe27f5" containerID="0d08847f90439a606c24746338e1babf388d0f4e01e141e2fe52bc5dffda55b9" exitCode=0 Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.658486 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"4870398d-de86-4dc6-9052-b6e80bfe27f5","Type":"ContainerDied","Data":"0d08847f90439a606c24746338e1babf388d0f4e01e141e2fe52bc5dffda55b9"} Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.660860 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" event={"ID":"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01","Type":"ContainerStarted","Data":"7c90bb4998a05feeeaaf7f7281c787361109edd31fa69728b7dd874cc299c107"} Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.679156 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-799c89c95d-bzssk" event={"ID":"eb97c512-09cc-43f3-8619-7083c7b803ff","Type":"ContainerStarted","Data":"2e78298cfa8c3be8e79565a650079d25de62e85cd46a806809ac329e726c2194"} Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.689636 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" event={"ID":"42897a96-1d94-485f-9448-792d48138492","Type":"ContainerStarted","Data":"398f07b3eb59fdfd95c7183eaa1c74939e0afa4490661700e176e8fb654a5ad1"} Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.690518 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.690542 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.690670 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.690697 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.694998 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 03 18:32:54 crc kubenswrapper[4835]: I1003 18:32:54.938337 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.035682 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4870398d-de86-4dc6-9052-b6e80bfe27f5-config-data\") pod \"4870398d-de86-4dc6-9052-b6e80bfe27f5\" (UID: \"4870398d-de86-4dc6-9052-b6e80bfe27f5\") " Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.035735 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgn45\" (UniqueName: \"kubernetes.io/projected/4870398d-de86-4dc6-9052-b6e80bfe27f5-kube-api-access-pgn45\") pod \"4870398d-de86-4dc6-9052-b6e80bfe27f5\" (UID: \"4870398d-de86-4dc6-9052-b6e80bfe27f5\") " Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.035787 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4870398d-de86-4dc6-9052-b6e80bfe27f5-logs\") pod \"4870398d-de86-4dc6-9052-b6e80bfe27f5\" (UID: \"4870398d-de86-4dc6-9052-b6e80bfe27f5\") " Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.035851 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4870398d-de86-4dc6-9052-b6e80bfe27f5-combined-ca-bundle\") pod \"4870398d-de86-4dc6-9052-b6e80bfe27f5\" (UID: \"4870398d-de86-4dc6-9052-b6e80bfe27f5\") " Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.037413 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4870398d-de86-4dc6-9052-b6e80bfe27f5-logs" (OuterVolumeSpecName: "logs") pod "4870398d-de86-4dc6-9052-b6e80bfe27f5" (UID: "4870398d-de86-4dc6-9052-b6e80bfe27f5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.043597 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4870398d-de86-4dc6-9052-b6e80bfe27f5-kube-api-access-pgn45" (OuterVolumeSpecName: "kube-api-access-pgn45") pod "4870398d-de86-4dc6-9052-b6e80bfe27f5" (UID: "4870398d-de86-4dc6-9052-b6e80bfe27f5"). InnerVolumeSpecName "kube-api-access-pgn45". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.072476 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4870398d-de86-4dc6-9052-b6e80bfe27f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4870398d-de86-4dc6-9052-b6e80bfe27f5" (UID: "4870398d-de86-4dc6-9052-b6e80bfe27f5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.137909 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgn45\" (UniqueName: \"kubernetes.io/projected/4870398d-de86-4dc6-9052-b6e80bfe27f5-kube-api-access-pgn45\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.137929 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4870398d-de86-4dc6-9052-b6e80bfe27f5-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.137938 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4870398d-de86-4dc6-9052-b6e80bfe27f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.197556 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4870398d-de86-4dc6-9052-b6e80bfe27f5-config-data" (OuterVolumeSpecName: "config-data") pod "4870398d-de86-4dc6-9052-b6e80bfe27f5" (UID: "4870398d-de86-4dc6-9052-b6e80bfe27f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.239359 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4870398d-de86-4dc6-9052-b6e80bfe27f5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.706499 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b45756688-tbfsl" event={"ID":"1505cc32-6896-425d-b36b-1b2d3504901b","Type":"ContainerStarted","Data":"8ae53724fccd3725c727bddb721ac81a7d0297b53ee19fe8f4276c44febb0e65"} Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.706539 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b45756688-tbfsl" event={"ID":"1505cc32-6896-425d-b36b-1b2d3504901b","Type":"ContainerStarted","Data":"5260b2300fda23f80796cae4e675c763968e93c918171e583c4a45af8379ec71"} Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.707577 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.707599 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.711443 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.711869 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"4870398d-de86-4dc6-9052-b6e80bfe27f5","Type":"ContainerDied","Data":"00607928be178c9b851aac8640dc7b476c26b3c8a837e8f0b9a0b621cdc4118f"} Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.711931 4835 scope.go:117] "RemoveContainer" containerID="0d08847f90439a606c24746338e1babf388d0f4e01e141e2fe52bc5dffda55b9" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.732754 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5b45756688-tbfsl" podStartSLOduration=3.73274009 podStartE2EDuration="3.73274009s" podCreationTimestamp="2025-10-03 18:32:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:55.728344203 +0000 UTC m=+1117.444285075" watchObservedRunningTime="2025-10-03 18:32:55.73274009 +0000 UTC m=+1117.448680962" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.739495 4835 generic.go:334] "Generic (PLEG): container finished" podID="d74b6af7-4f28-432b-b2bb-b2a6b4bccf01" containerID="25d2683dbee2ecf278f809adb920be370f8f8d45740841d4468b8bbcb695192a" exitCode=0 Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.739583 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" event={"ID":"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01","Type":"ContainerDied","Data":"25d2683dbee2ecf278f809adb920be370f8f8d45740841d4468b8bbcb695192a"} Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.755170 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.765318 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-799c89c95d-bzssk" event={"ID":"eb97c512-09cc-43f3-8619-7083c7b803ff","Type":"ContainerStarted","Data":"2555f40e7a264f4c2d2927c68046058de86bb2d465a2698aa6055de6b0d636b2"} Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.765366 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-799c89c95d-bzssk" event={"ID":"eb97c512-09cc-43f3-8619-7083c7b803ff","Type":"ContainerStarted","Data":"8b96628c3961a59a217aae11326e61213845d9db37adef12d7e12eb4d669f0f0"} Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.766577 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.767107 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.773984 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.796550 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Oct 03 18:32:55 crc kubenswrapper[4835]: E1003 18:32:55.797049 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4870398d-de86-4dc6-9052-b6e80bfe27f5" containerName="watcher-applier" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.797086 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4870398d-de86-4dc6-9052-b6e80bfe27f5" containerName="watcher-applier" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 
18:32:55.797293 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="4870398d-de86-4dc6-9052-b6e80bfe27f5" containerName="watcher-applier" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.798425 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.800620 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.807093 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.844131 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-799c89c95d-bzssk" podStartSLOduration=3.84411315 podStartE2EDuration="3.84411315s" podCreationTimestamp="2025-10-03 18:32:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:55.81701725 +0000 UTC m=+1117.532958132" watchObservedRunningTime="2025-10-03 18:32:55.84411315 +0000 UTC m=+1117.560054022" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.859157 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a9c92f4-cd5c-4917-8ce8-5619892d5470-config-data\") pod \"watcher-applier-0\" (UID: \"7a9c92f4-cd5c-4917-8ce8-5619892d5470\") " pod="openstack/watcher-applier-0" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.859221 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a9c92f4-cd5c-4917-8ce8-5619892d5470-logs\") pod \"watcher-applier-0\" (UID: \"7a9c92f4-cd5c-4917-8ce8-5619892d5470\") " pod="openstack/watcher-applier-0" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.859273 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7nmt\" (UniqueName: \"kubernetes.io/projected/7a9c92f4-cd5c-4917-8ce8-5619892d5470-kube-api-access-j7nmt\") pod \"watcher-applier-0\" (UID: \"7a9c92f4-cd5c-4917-8ce8-5619892d5470\") " pod="openstack/watcher-applier-0" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.861193 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a9c92f4-cd5c-4917-8ce8-5619892d5470-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"7a9c92f4-cd5c-4917-8ce8-5619892d5470\") " pod="openstack/watcher-applier-0" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.964036 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a9c92f4-cd5c-4917-8ce8-5619892d5470-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"7a9c92f4-cd5c-4917-8ce8-5619892d5470\") " pod="openstack/watcher-applier-0" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.964642 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a9c92f4-cd5c-4917-8ce8-5619892d5470-config-data\") pod \"watcher-applier-0\" (UID: \"7a9c92f4-cd5c-4917-8ce8-5619892d5470\") " pod="openstack/watcher-applier-0" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.964695 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a9c92f4-cd5c-4917-8ce8-5619892d5470-logs\") pod \"watcher-applier-0\" (UID: \"7a9c92f4-cd5c-4917-8ce8-5619892d5470\") " pod="openstack/watcher-applier-0" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.965142 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7nmt\" (UniqueName: \"kubernetes.io/projected/7a9c92f4-cd5c-4917-8ce8-5619892d5470-kube-api-access-j7nmt\") pod \"watcher-applier-0\" (UID: \"7a9c92f4-cd5c-4917-8ce8-5619892d5470\") " pod="openstack/watcher-applier-0" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.966451 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a9c92f4-cd5c-4917-8ce8-5619892d5470-logs\") pod \"watcher-applier-0\" (UID: \"7a9c92f4-cd5c-4917-8ce8-5619892d5470\") " pod="openstack/watcher-applier-0" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.970866 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a9c92f4-cd5c-4917-8ce8-5619892d5470-config-data\") pod \"watcher-applier-0\" (UID: \"7a9c92f4-cd5c-4917-8ce8-5619892d5470\") " pod="openstack/watcher-applier-0" Oct 03 18:32:55 crc kubenswrapper[4835]: I1003 18:32:55.971628 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a9c92f4-cd5c-4917-8ce8-5619892d5470-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"7a9c92f4-cd5c-4917-8ce8-5619892d5470\") " pod="openstack/watcher-applier-0" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.007242 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7nmt\" (UniqueName: \"kubernetes.io/projected/7a9c92f4-cd5c-4917-8ce8-5619892d5470-kube-api-access-j7nmt\") pod \"watcher-applier-0\" (UID: \"7a9c92f4-cd5c-4917-8ce8-5619892d5470\") " pod="openstack/watcher-applier-0" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.265628 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.430720 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-9cdbfcc7d-ccpdw"] Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.433005 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.436404 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.438558 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.451283 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9cdbfcc7d-ccpdw"] Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.583605 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/204b3b60-3ae4-4915-8810-3423d4308efb-internal-tls-certs\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.583914 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204b3b60-3ae4-4915-8810-3423d4308efb-config-data\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.583932 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/204b3b60-3ae4-4915-8810-3423d4308efb-public-tls-certs\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.583975 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jngjj\" (UniqueName: \"kubernetes.io/projected/204b3b60-3ae4-4915-8810-3423d4308efb-kube-api-access-jngjj\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.584027 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204b3b60-3ae4-4915-8810-3423d4308efb-combined-ca-bundle\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.584090 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/204b3b60-3ae4-4915-8810-3423d4308efb-config-data-custom\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.589532 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204b3b60-3ae4-4915-8810-3423d4308efb-logs\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.691114 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204b3b60-3ae4-4915-8810-3423d4308efb-logs\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.691641 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/204b3b60-3ae4-4915-8810-3423d4308efb-internal-tls-certs\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.691731 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204b3b60-3ae4-4915-8810-3423d4308efb-config-data\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.691758 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/204b3b60-3ae4-4915-8810-3423d4308efb-public-tls-certs\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.691810 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jngjj\" (UniqueName: \"kubernetes.io/projected/204b3b60-3ae4-4915-8810-3423d4308efb-kube-api-access-jngjj\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.691875 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204b3b60-3ae4-4915-8810-3423d4308efb-combined-ca-bundle\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.691946 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/204b3b60-3ae4-4915-8810-3423d4308efb-config-data-custom\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.702314 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/204b3b60-3ae4-4915-8810-3423d4308efb-logs\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.710768 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/204b3b60-3ae4-4915-8810-3423d4308efb-internal-tls-certs\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.719852 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/204b3b60-3ae4-4915-8810-3423d4308efb-combined-ca-bundle\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.720768 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204b3b60-3ae4-4915-8810-3423d4308efb-config-data\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.722126 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/204b3b60-3ae4-4915-8810-3423d4308efb-config-data-custom\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.729571 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/204b3b60-3ae4-4915-8810-3423d4308efb-public-tls-certs\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.730593 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jngjj\" (UniqueName: \"kubernetes.io/projected/204b3b60-3ae4-4915-8810-3423d4308efb-kube-api-access-jngjj\") pod \"barbican-api-9cdbfcc7d-ccpdw\" (UID: \"204b3b60-3ae4-4915-8810-3423d4308efb\") " pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.758690 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.808895 4835 generic.go:334] "Generic (PLEG): container finished" podID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerID="e806a91c1d0ed6742cb427f345aabf9762384e27525ea6c3dea58c86eb291aac" exitCode=1 Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.808999 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8110d0e5-9e19-4306-b8aa-babe937e8d2a","Type":"ContainerDied","Data":"e806a91c1d0ed6742cb427f345aabf9762384e27525ea6c3dea58c86eb291aac"} Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.809032 4835 scope.go:117] "RemoveContainer" containerID="de105950d6ac7e7156a461005afa919948679bf395b58c050488f8cb4863a6cf" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.809679 4835 scope.go:117] "RemoveContainer" containerID="e806a91c1d0ed6742cb427f345aabf9762384e27525ea6c3dea58c86eb291aac" Oct 03 18:32:56 crc kubenswrapper[4835]: E1003 18:32:56.809941 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(8110d0e5-9e19-4306-b8aa-babe937e8d2a)\"" pod="openstack/watcher-decision-engine-0" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.814605 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dpk2w" event={"ID":"705966b1-0d0b-4c12-9cc1-830277fcf80c","Type":"ContainerStarted","Data":"6ca2f11f0cbfad15130861cd6db25151abca97e492dafa18163bd5cbdc425bba"} Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.829085 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" event={"ID":"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01","Type":"ContainerStarted","Data":"bf152e40061f253da96c8f28442adf39bf245f9ead88ea848009341bbc55e28f"} Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.829222 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.829255 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.829332 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.829344 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.860670 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" podStartSLOduration=4.860651537 podStartE2EDuration="4.860651537s" podCreationTimestamp="2025-10-03 18:32:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:56.85746187 +0000 UTC m=+1118.573402752" watchObservedRunningTime="2025-10-03 18:32:56.860651537 +0000 UTC m=+1118.576592409" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.887473 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-dpk2w" podStartSLOduration=4.505172454 podStartE2EDuration="49.887450729s" podCreationTimestamp="2025-10-03 18:32:07 +0000 UTC" firstStartedPulling="2025-10-03 18:32:09.679710529 +0000 UTC 
m=+1071.395651401" lastFinishedPulling="2025-10-03 18:32:55.061988804 +0000 UTC m=+1116.777929676" observedRunningTime="2025-10-03 18:32:56.880735955 +0000 UTC m=+1118.596676837" watchObservedRunningTime="2025-10-03 18:32:56.887450729 +0000 UTC m=+1118.603391601" Oct 03 18:32:56 crc kubenswrapper[4835]: I1003 18:32:56.912614 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4870398d-de86-4dc6-9052-b6e80bfe27f5" path="/var/lib/kubelet/pods/4870398d-de86-4dc6-9052-b6e80bfe27f5/volumes" Oct 03 18:32:57 crc kubenswrapper[4835]: I1003 18:32:57.611865 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 18:32:57 crc kubenswrapper[4835]: I1003 18:32:57.614868 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 18:32:57 crc kubenswrapper[4835]: I1003 18:32:57.743982 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 18:32:57 crc kubenswrapper[4835]: I1003 18:32:57.833250 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 18:32:57 crc kubenswrapper[4835]: I1003 18:32:57.838649 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:32:58 crc kubenswrapper[4835]: I1003 18:32:58.166552 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 18:32:58 crc kubenswrapper[4835]: I1003 18:32:58.166976 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 18:32:58 crc kubenswrapper[4835]: I1003 18:32:58.167891 4835 scope.go:117] "RemoveContainer" containerID="e806a91c1d0ed6742cb427f345aabf9762384e27525ea6c3dea58c86eb291aac" Oct 03 18:32:58 crc kubenswrapper[4835]: E1003 18:32:58.168434 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(8110d0e5-9e19-4306-b8aa-babe937e8d2a)\"" pod="openstack/watcher-decision-engine-0" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" Oct 03 18:32:58 crc kubenswrapper[4835]: I1003 18:32:58.529903 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9cdbfcc7d-ccpdw"] Oct 03 18:32:58 crc kubenswrapper[4835]: W1003 18:32:58.560224 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod204b3b60_3ae4_4915_8810_3423d4308efb.slice/crio-b9e34013af969ec80b41421b338b3ae626e64e95bfb1afb2d0b3554d2a0abfa1 WatchSource:0}: Error finding container b9e34013af969ec80b41421b338b3ae626e64e95bfb1afb2d0b3554d2a0abfa1: Status 404 returned error can't find the container with id b9e34013af969ec80b41421b338b3ae626e64e95bfb1afb2d0b3554d2a0abfa1 Oct 03 18:32:58 crc kubenswrapper[4835]: I1003 18:32:58.562518 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 03 18:32:58 crc kubenswrapper[4835]: I1003 18:32:58.858212 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" 
event={"ID":"42897a96-1d94-485f-9448-792d48138492","Type":"ContainerStarted","Data":"b4c353ed931a7960d43b80f9e7ab4f50857b4d36e14361901d8a519faf036d18"} Oct 03 18:32:58 crc kubenswrapper[4835]: I1003 18:32:58.860859 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-649c499755-ttlj6" event={"ID":"9343cae5-9ae1-4cae-b5a1-31acc9b34217","Type":"ContainerStarted","Data":"6bdcb8deee543adad2db062469283a122cb152a153cec1da1b6dd6a771804068"} Oct 03 18:32:58 crc kubenswrapper[4835]: I1003 18:32:58.922666 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9cdbfcc7d-ccpdw" event={"ID":"204b3b60-3ae4-4915-8810-3423d4308efb","Type":"ContainerStarted","Data":"b9e34013af969ec80b41421b338b3ae626e64e95bfb1afb2d0b3554d2a0abfa1"} Oct 03 18:32:58 crc kubenswrapper[4835]: I1003 18:32:58.922711 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"7a9c92f4-cd5c-4917-8ce8-5619892d5470","Type":"ContainerStarted","Data":"a014a321f5dfa3f02eef23dfd76c135afe4afdbd37ca71cad8462ef6934222db"} Oct 03 18:32:59 crc kubenswrapper[4835]: I1003 18:32:59.619371 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 03 18:32:59 crc kubenswrapper[4835]: I1003 18:32:59.621742 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="352d9656-3048-419a-9b51-c9e01d349bc6" containerName="watcher-api-log" containerID="cri-o://ea3368f2eb11d58090f8794af54d5b598bfca76a5de102607e9bf4f2204c1ca2" gracePeriod=30 Oct 03 18:32:59 crc kubenswrapper[4835]: I1003 18:32:59.622247 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="352d9656-3048-419a-9b51-c9e01d349bc6" containerName="watcher-api" containerID="cri-o://e6fe85f3bc5b3acea795f802bf4cfe16437a2b4a2188b9cd022d79cf731a0899" gracePeriod=30 Oct 03 18:32:59 crc kubenswrapper[4835]: I1003 18:32:59.917229 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" event={"ID":"42897a96-1d94-485f-9448-792d48138492","Type":"ContainerStarted","Data":"5b8dcde3b1e14bf3c0299f4d2c4bdb87d91739cd0cc1708c84cd1b640a9ceaa7"} Oct 03 18:32:59 crc kubenswrapper[4835]: I1003 18:32:59.941181 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-649c499755-ttlj6" event={"ID":"9343cae5-9ae1-4cae-b5a1-31acc9b34217","Type":"ContainerStarted","Data":"8b5394f1cadc763c8eb907db1d7c879f9f922c85b05837c240362544fc354eb8"} Oct 03 18:32:59 crc kubenswrapper[4835]: I1003 18:32:59.946063 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9cdbfcc7d-ccpdw" event={"ID":"204b3b60-3ae4-4915-8810-3423d4308efb","Type":"ContainerStarted","Data":"88ab7938adc4be42cdacf47be325b6511fe07749340eab516aa4c73ac4342eaa"} Oct 03 18:32:59 crc kubenswrapper[4835]: I1003 18:32:59.946154 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9cdbfcc7d-ccpdw" event={"ID":"204b3b60-3ae4-4915-8810-3423d4308efb","Type":"ContainerStarted","Data":"5f1d61be6598982cb80d698ff5162588c3f954bc47233088213683e10b55f63a"} Oct 03 18:32:59 crc kubenswrapper[4835]: I1003 18:32:59.946805 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:59 crc kubenswrapper[4835]: I1003 18:32:59.946834 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:32:59 crc kubenswrapper[4835]: I1003 18:32:59.956006 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-8585b7f888-6wk2b" podStartSLOduration=3.821213903 podStartE2EDuration="7.955984332s" podCreationTimestamp="2025-10-03 18:32:52 +0000 UTC" firstStartedPulling="2025-10-03 18:32:53.995647565 +0000 UTC m=+1115.711588437" lastFinishedPulling="2025-10-03 18:32:58.130417994 +0000 UTC m=+1119.846358866" observedRunningTime="2025-10-03 18:32:59.95265909 +0000 UTC m=+1121.668599962" watchObservedRunningTime="2025-10-03 18:32:59.955984332 +0000 UTC m=+1121.671925204" Oct 03 18:32:59 crc kubenswrapper[4835]: I1003 18:32:59.999024 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-9cdbfcc7d-ccpdw" podStartSLOduration=3.999002897 podStartE2EDuration="3.999002897s" podCreationTimestamp="2025-10-03 18:32:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:32:59.976684294 +0000 UTC m=+1121.692625166" watchObservedRunningTime="2025-10-03 18:32:59.999002897 +0000 UTC m=+1121.714943769" Oct 03 18:33:00 crc kubenswrapper[4835]: I1003 18:33:00.016162 4835 generic.go:334] "Generic (PLEG): container finished" podID="352d9656-3048-419a-9b51-c9e01d349bc6" containerID="ea3368f2eb11d58090f8794af54d5b598bfca76a5de102607e9bf4f2204c1ca2" exitCode=143 Oct 03 18:33:00 crc kubenswrapper[4835]: I1003 18:33:00.016240 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"352d9656-3048-419a-9b51-c9e01d349bc6","Type":"ContainerDied","Data":"ea3368f2eb11d58090f8794af54d5b598bfca76a5de102607e9bf4f2204c1ca2"} Oct 03 18:33:00 crc kubenswrapper[4835]: I1003 18:33:00.036241 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-649c499755-ttlj6" podStartSLOduration=3.584610578 podStartE2EDuration="8.036222883s" podCreationTimestamp="2025-10-03 18:32:52 +0000 UTC" firstStartedPulling="2025-10-03 18:32:53.662928882 +0000 UTC m=+1115.378869754" lastFinishedPulling="2025-10-03 18:32:58.114541187 +0000 UTC m=+1119.830482059" observedRunningTime="2025-10-03 18:33:00.012064366 +0000 UTC m=+1121.728005238" watchObservedRunningTime="2025-10-03 18:33:00.036222883 +0000 UTC m=+1121.752163755" Oct 03 18:33:00 crc kubenswrapper[4835]: I1003 18:33:00.083299 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"7a9c92f4-cd5c-4917-8ce8-5619892d5470","Type":"ContainerStarted","Data":"3178671e78bf6e6e52af9cabb84dcfeb6b91c0cedd8fd956dec29fe5e5a65f7b"} Oct 03 18:33:00 crc kubenswrapper[4835]: I1003 18:33:00.117623 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=5.117601853 podStartE2EDuration="5.117601853s" podCreationTimestamp="2025-10-03 18:32:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:33:00.102981997 +0000 UTC m=+1121.818922859" watchObservedRunningTime="2025-10-03 18:33:00.117601853 +0000 UTC m=+1121.833542725" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.063002 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.107578 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-combined-ca-bundle\") pod \"352d9656-3048-419a-9b51-c9e01d349bc6\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.107662 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/352d9656-3048-419a-9b51-c9e01d349bc6-logs\") pod \"352d9656-3048-419a-9b51-c9e01d349bc6\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.107778 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k66x8\" (UniqueName: \"kubernetes.io/projected/352d9656-3048-419a-9b51-c9e01d349bc6-kube-api-access-k66x8\") pod \"352d9656-3048-419a-9b51-c9e01d349bc6\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.107875 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-custom-prometheus-ca\") pod \"352d9656-3048-419a-9b51-c9e01d349bc6\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.107927 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-config-data\") pod \"352d9656-3048-419a-9b51-c9e01d349bc6\" (UID: \"352d9656-3048-419a-9b51-c9e01d349bc6\") " Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.109656 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/352d9656-3048-419a-9b51-c9e01d349bc6-logs" (OuterVolumeSpecName: "logs") pod "352d9656-3048-419a-9b51-c9e01d349bc6" (UID: "352d9656-3048-419a-9b51-c9e01d349bc6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.115475 4835 generic.go:334] "Generic (PLEG): container finished" podID="352d9656-3048-419a-9b51-c9e01d349bc6" containerID="e6fe85f3bc5b3acea795f802bf4cfe16437a2b4a2188b9cd022d79cf731a0899" exitCode=0 Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.115745 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"352d9656-3048-419a-9b51-c9e01d349bc6","Type":"ContainerDied","Data":"e6fe85f3bc5b3acea795f802bf4cfe16437a2b4a2188b9cd022d79cf731a0899"} Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.115777 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"352d9656-3048-419a-9b51-c9e01d349bc6","Type":"ContainerDied","Data":"9341b386bc6205d5c954c3e318fe1e760b695dc92b9855d56d617fdfccbb7efe"} Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.115783 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.115792 4835 scope.go:117] "RemoveContainer" containerID="e6fe85f3bc5b3acea795f802bf4cfe16437a2b4a2188b9cd022d79cf731a0899" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.136490 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/352d9656-3048-419a-9b51-c9e01d349bc6-kube-api-access-k66x8" (OuterVolumeSpecName: "kube-api-access-k66x8") pod "352d9656-3048-419a-9b51-c9e01d349bc6" (UID: "352d9656-3048-419a-9b51-c9e01d349bc6"). InnerVolumeSpecName "kube-api-access-k66x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.138194 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "352d9656-3048-419a-9b51-c9e01d349bc6" (UID: "352d9656-3048-419a-9b51-c9e01d349bc6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.155228 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "352d9656-3048-419a-9b51-c9e01d349bc6" (UID: "352d9656-3048-419a-9b51-c9e01d349bc6"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.185218 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-config-data" (OuterVolumeSpecName: "config-data") pod "352d9656-3048-419a-9b51-c9e01d349bc6" (UID: "352d9656-3048-419a-9b51-c9e01d349bc6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.209855 4835 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.209882 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.209891 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352d9656-3048-419a-9b51-c9e01d349bc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.209900 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/352d9656-3048-419a-9b51-c9e01d349bc6-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.209909 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k66x8\" (UniqueName: \"kubernetes.io/projected/352d9656-3048-419a-9b51-c9e01d349bc6-kube-api-access-k66x8\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.266869 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.454163 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.463456 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.479908 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 03 18:33:01 crc kubenswrapper[4835]: E1003 18:33:01.480316 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352d9656-3048-419a-9b51-c9e01d349bc6" containerName="watcher-api-log" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.480334 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="352d9656-3048-419a-9b51-c9e01d349bc6" containerName="watcher-api-log" Oct 03 18:33:01 crc kubenswrapper[4835]: E1003 18:33:01.480353 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352d9656-3048-419a-9b51-c9e01d349bc6" containerName="watcher-api" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.480360 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="352d9656-3048-419a-9b51-c9e01d349bc6" containerName="watcher-api" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.480532 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="352d9656-3048-419a-9b51-c9e01d349bc6" containerName="watcher-api-log" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.480565 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="352d9656-3048-419a-9b51-c9e01d349bc6" containerName="watcher-api" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.487945 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.507403 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.507658 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.507899 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.517878 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.517985 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.518029 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.518061 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-public-tls-certs\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.518153 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr5dk\" (UniqueName: \"kubernetes.io/projected/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-kube-api-access-nr5dk\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.518246 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-config-data\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.518278 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-logs\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.540179 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.620742 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr5dk\" (UniqueName: 
\"kubernetes.io/projected/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-kube-api-access-nr5dk\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.620847 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-config-data\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.620868 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-logs\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.620889 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.620943 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.620965 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.620985 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-public-tls-certs\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.621473 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-logs\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.625608 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.626019 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.626048 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.626633 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-public-tls-certs\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.631675 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-config-data\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.636291 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr5dk\" (UniqueName: \"kubernetes.io/projected/0aa3d7ce-c1f2-40a5-b63b-b39daee108fb-kube-api-access-nr5dk\") pod \"watcher-api-0\" (UID: \"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb\") " pod="openstack/watcher-api-0" Oct 03 18:33:01 crc kubenswrapper[4835]: I1003 18:33:01.816037 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 03 18:33:02 crc kubenswrapper[4835]: I1003 18:33:02.888823 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="352d9656-3048-419a-9b51-c9e01d349bc6" path="/var/lib/kubelet/pods/352d9656-3048-419a-9b51-c9e01d349bc6/volumes" Oct 03 18:33:02 crc kubenswrapper[4835]: I1003 18:33:02.965899 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:33:02 crc kubenswrapper[4835]: I1003 18:33:02.989806 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-84859df966-b4t26" Oct 03 18:33:03 crc kubenswrapper[4835]: I1003 18:33:03.345217 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:33:03 crc kubenswrapper[4835]: I1003 18:33:03.419650 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64bb4dc8df-rxs4r"] Oct 03 18:33:03 crc kubenswrapper[4835]: I1003 18:33:03.419896 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" podUID="9ed367e4-c09b-46d6-82d0-f43eb6c4417d" containerName="dnsmasq-dns" containerID="cri-o://a4ce161e93558d68738f93dcb9b50888432950540aa7d4afb1f088b99380846f" gracePeriod=10 Oct 03 18:33:04 crc kubenswrapper[4835]: I1003 18:33:04.157926 4835 generic.go:334] "Generic (PLEG): container finished" podID="9ed367e4-c09b-46d6-82d0-f43eb6c4417d" containerID="a4ce161e93558d68738f93dcb9b50888432950540aa7d4afb1f088b99380846f" exitCode=0 Oct 03 18:33:04 crc kubenswrapper[4835]: I1003 18:33:04.158113 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" event={"ID":"9ed367e4-c09b-46d6-82d0-f43eb6c4417d","Type":"ContainerDied","Data":"a4ce161e93558d68738f93dcb9b50888432950540aa7d4afb1f088b99380846f"} Oct 03 18:33:04 crc kubenswrapper[4835]: I1003 18:33:04.959734 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:33:05 crc kubenswrapper[4835]: I1003 18:33:05.085406 4835 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:33:05 crc kubenswrapper[4835]: I1003 18:33:05.133190 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-84859df966-b4t26" Oct 03 18:33:05 crc kubenswrapper[4835]: I1003 18:33:05.139116 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:33:05 crc kubenswrapper[4835]: I1003 18:33:05.197847 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64dcfd48b6-tpcpd"] Oct 03 18:33:05 crc kubenswrapper[4835]: I1003 18:33:05.198038 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64dcfd48b6-tpcpd" podUID="de5d465a-f009-4cef-940e-3b2aaa64468b" containerName="horizon-log" containerID="cri-o://c98cca13a795e8c49418d2713ce1f52d7c5add2a8a45469c7d0d5ca2a4207bec" gracePeriod=30 Oct 03 18:33:05 crc kubenswrapper[4835]: I1003 18:33:05.198440 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64dcfd48b6-tpcpd" podUID="de5d465a-f009-4cef-940e-3b2aaa64468b" containerName="horizon" containerID="cri-o://83e641e13eea890082718c9960c998038c38e8137ea51f5b598f75b184ca52c4" gracePeriod=30 Oct 03 18:33:05 crc kubenswrapper[4835]: I1003 18:33:05.359433 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:33:05 crc kubenswrapper[4835]: I1003 18:33:05.359521 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.183143 4835 generic.go:334] "Generic (PLEG): container finished" podID="705966b1-0d0b-4c12-9cc1-830277fcf80c" containerID="6ca2f11f0cbfad15130861cd6db25151abca97e492dafa18163bd5cbdc425bba" exitCode=0 Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.183222 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dpk2w" event={"ID":"705966b1-0d0b-4c12-9cc1-830277fcf80c","Type":"ContainerDied","Data":"6ca2f11f0cbfad15130861cd6db25151abca97e492dafa18163bd5cbdc425bba"} Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.186551 4835 generic.go:334] "Generic (PLEG): container finished" podID="de5d465a-f009-4cef-940e-3b2aaa64468b" containerID="83e641e13eea890082718c9960c998038c38e8137ea51f5b598f75b184ca52c4" exitCode=0 Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.186596 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64dcfd48b6-tpcpd" event={"ID":"de5d465a-f009-4cef-940e-3b2aaa64468b","Type":"ContainerDied","Data":"83e641e13eea890082718c9960c998038c38e8137ea51f5b598f75b184ca52c4"} Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.266992 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.297712 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/watcher-applier-0" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.345618 4835 scope.go:117] "RemoveContainer" containerID="ea3368f2eb11d58090f8794af54d5b598bfca76a5de102607e9bf4f2204c1ca2" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.454702 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.505277 4835 scope.go:117] "RemoveContainer" containerID="e6fe85f3bc5b3acea795f802bf4cfe16437a2b4a2188b9cd022d79cf731a0899" Oct 03 18:33:06 crc kubenswrapper[4835]: E1003 18:33:06.506942 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6fe85f3bc5b3acea795f802bf4cfe16437a2b4a2188b9cd022d79cf731a0899\": container with ID starting with e6fe85f3bc5b3acea795f802bf4cfe16437a2b4a2188b9cd022d79cf731a0899 not found: ID does not exist" containerID="e6fe85f3bc5b3acea795f802bf4cfe16437a2b4a2188b9cd022d79cf731a0899" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.506974 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6fe85f3bc5b3acea795f802bf4cfe16437a2b4a2188b9cd022d79cf731a0899"} err="failed to get container status \"e6fe85f3bc5b3acea795f802bf4cfe16437a2b4a2188b9cd022d79cf731a0899\": rpc error: code = NotFound desc = could not find container \"e6fe85f3bc5b3acea795f802bf4cfe16437a2b4a2188b9cd022d79cf731a0899\": container with ID starting with e6fe85f3bc5b3acea795f802bf4cfe16437a2b4a2188b9cd022d79cf731a0899 not found: ID does not exist" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.506997 4835 scope.go:117] "RemoveContainer" containerID="ea3368f2eb11d58090f8794af54d5b598bfca76a5de102607e9bf4f2204c1ca2" Oct 03 18:33:06 crc kubenswrapper[4835]: E1003 18:33:06.507980 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea3368f2eb11d58090f8794af54d5b598bfca76a5de102607e9bf4f2204c1ca2\": container with ID starting with ea3368f2eb11d58090f8794af54d5b598bfca76a5de102607e9bf4f2204c1ca2 not found: ID does not exist" containerID="ea3368f2eb11d58090f8794af54d5b598bfca76a5de102607e9bf4f2204c1ca2" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.508007 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea3368f2eb11d58090f8794af54d5b598bfca76a5de102607e9bf4f2204c1ca2"} err="failed to get container status \"ea3368f2eb11d58090f8794af54d5b598bfca76a5de102607e9bf4f2204c1ca2\": rpc error: code = NotFound desc = could not find container \"ea3368f2eb11d58090f8794af54d5b598bfca76a5de102607e9bf4f2204c1ca2\": container with ID starting with ea3368f2eb11d58090f8794af54d5b598bfca76a5de102607e9bf4f2204c1ca2 not found: ID does not exist" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.520163 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-config\") pod \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.520289 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-dns-swift-storage-0\") pod \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\" (UID: 
\"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.520329 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-ovsdbserver-nb\") pod \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.520377 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-ovsdbserver-sb\") pod \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.520448 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9jnr\" (UniqueName: \"kubernetes.io/projected/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-kube-api-access-x9jnr\") pod \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.520466 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-dns-svc\") pod \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\" (UID: \"9ed367e4-c09b-46d6-82d0-f43eb6c4417d\") " Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.555638 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-kube-api-access-x9jnr" (OuterVolumeSpecName: "kube-api-access-x9jnr") pod "9ed367e4-c09b-46d6-82d0-f43eb6c4417d" (UID: "9ed367e4-c09b-46d6-82d0-f43eb6c4417d"). InnerVolumeSpecName "kube-api-access-x9jnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.617855 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9ed367e4-c09b-46d6-82d0-f43eb6c4417d" (UID: "9ed367e4-c09b-46d6-82d0-f43eb6c4417d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.623807 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9jnr\" (UniqueName: \"kubernetes.io/projected/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-kube-api-access-x9jnr\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.623831 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.629309 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9ed367e4-c09b-46d6-82d0-f43eb6c4417d" (UID: "9ed367e4-c09b-46d6-82d0-f43eb6c4417d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.636952 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-config" (OuterVolumeSpecName: "config") pod "9ed367e4-c09b-46d6-82d0-f43eb6c4417d" (UID: "9ed367e4-c09b-46d6-82d0-f43eb6c4417d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.637146 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9ed367e4-c09b-46d6-82d0-f43eb6c4417d" (UID: "9ed367e4-c09b-46d6-82d0-f43eb6c4417d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.650641 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9ed367e4-c09b-46d6-82d0-f43eb6c4417d" (UID: "9ed367e4-c09b-46d6-82d0-f43eb6c4417d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.725137 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.725417 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.725428 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:06 crc kubenswrapper[4835]: I1003 18:33:06.725438 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed367e4-c09b-46d6-82d0-f43eb6c4417d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.089994 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.198351 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb","Type":"ContainerStarted","Data":"1230df7dd59e7c84ea58703fda2acae116c7aa8305b2fe96fd75dbb844e8eb38"} Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.201097 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af75f57a-7612-48c8-b3fb-8594e81e2d0a","Type":"ContainerStarted","Data":"c65996a826cd8ebb9934d2351812b1e05bd9a0d4ea0f1e45173803e42601ea5a"} Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.201166 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerName="ceilometer-central-agent" containerID="cri-o://6af7b4aca2ab42921f496c3018bb159e5b3950431d0b3ea9aebf155918194167" gracePeriod=30 Oct 03 18:33:07 crc 
kubenswrapper[4835]: I1003 18:33:07.201218 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.201238 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerName="sg-core" containerID="cri-o://2d8b620a897b5436e95f47f581862450b09d7ca2e085f89555d7329fe76e3fa1" gracePeriod=30 Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.201275 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerName="ceilometer-notification-agent" containerID="cri-o://64b3d992f1132cf7ecff87b380648189301d52c52ecdb56ec440407b04213ed6" gracePeriod=30 Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.201232 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerName="proxy-httpd" containerID="cri-o://c65996a826cd8ebb9934d2351812b1e05bd9a0d4ea0f1e45173803e42601ea5a" gracePeriod=30 Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.227988 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" event={"ID":"9ed367e4-c09b-46d6-82d0-f43eb6c4417d","Type":"ContainerDied","Data":"fcb0c806b985135c3ee20910e7b7d832f0edcf3d32755b4a74cd0dd7134c7936"} Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.228055 4835 scope.go:117] "RemoveContainer" containerID="a4ce161e93558d68738f93dcb9b50888432950540aa7d4afb1f088b99380846f" Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.228168 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.229561 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.359566487 podStartE2EDuration="1m0.229545701s" podCreationTimestamp="2025-10-03 18:32:07 +0000 UTC" firstStartedPulling="2025-10-03 18:32:10.635252798 +0000 UTC m=+1072.351193670" lastFinishedPulling="2025-10-03 18:33:06.505232012 +0000 UTC m=+1128.221172884" observedRunningTime="2025-10-03 18:33:07.222288975 +0000 UTC m=+1128.938229847" watchObservedRunningTime="2025-10-03 18:33:07.229545701 +0000 UTC m=+1128.945486573" Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.257857 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64bb4dc8df-rxs4r"] Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.260537 4835 scope.go:117] "RemoveContainer" containerID="9f2a2f8c71eed05b1a59bad13872c4d4edff6fed510baf8713ceb93e74c0fbb5" Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.272520 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64bb4dc8df-rxs4r"] Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.282369 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.781149 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.947395 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-combined-ca-bundle\") pod \"705966b1-0d0b-4c12-9cc1-830277fcf80c\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.947584 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-config-data\") pod \"705966b1-0d0b-4c12-9cc1-830277fcf80c\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.947634 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22cdq\" (UniqueName: \"kubernetes.io/projected/705966b1-0d0b-4c12-9cc1-830277fcf80c-kube-api-access-22cdq\") pod \"705966b1-0d0b-4c12-9cc1-830277fcf80c\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.947678 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-scripts\") pod \"705966b1-0d0b-4c12-9cc1-830277fcf80c\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.947702 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-db-sync-config-data\") pod \"705966b1-0d0b-4c12-9cc1-830277fcf80c\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.947722 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/705966b1-0d0b-4c12-9cc1-830277fcf80c-etc-machine-id\") pod \"705966b1-0d0b-4c12-9cc1-830277fcf80c\" (UID: \"705966b1-0d0b-4c12-9cc1-830277fcf80c\") " Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.948098 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/705966b1-0d0b-4c12-9cc1-830277fcf80c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "705966b1-0d0b-4c12-9cc1-830277fcf80c" (UID: "705966b1-0d0b-4c12-9cc1-830277fcf80c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.948535 4835 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/705966b1-0d0b-4c12-9cc1-830277fcf80c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.952108 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/705966b1-0d0b-4c12-9cc1-830277fcf80c-kube-api-access-22cdq" (OuterVolumeSpecName: "kube-api-access-22cdq") pod "705966b1-0d0b-4c12-9cc1-830277fcf80c" (UID: "705966b1-0d0b-4c12-9cc1-830277fcf80c"). InnerVolumeSpecName "kube-api-access-22cdq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.952301 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-scripts" (OuterVolumeSpecName: "scripts") pod "705966b1-0d0b-4c12-9cc1-830277fcf80c" (UID: "705966b1-0d0b-4c12-9cc1-830277fcf80c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.952509 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "705966b1-0d0b-4c12-9cc1-830277fcf80c" (UID: "705966b1-0d0b-4c12-9cc1-830277fcf80c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:07 crc kubenswrapper[4835]: I1003 18:33:07.981084 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "705966b1-0d0b-4c12-9cc1-830277fcf80c" (UID: "705966b1-0d0b-4c12-9cc1-830277fcf80c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.007727 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-config-data" (OuterVolumeSpecName: "config-data") pod "705966b1-0d0b-4c12-9cc1-830277fcf80c" (UID: "705966b1-0d0b-4c12-9cc1-830277fcf80c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.051627 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.051653 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22cdq\" (UniqueName: \"kubernetes.io/projected/705966b1-0d0b-4c12-9cc1-830277fcf80c-kube-api-access-22cdq\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.051665 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.051675 4835 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.051683 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/705966b1-0d0b-4c12-9cc1-830277fcf80c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.166398 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.166451 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 
18:33:08.167099 4835 scope.go:117] "RemoveContainer" containerID="e806a91c1d0ed6742cb427f345aabf9762384e27525ea6c3dea58c86eb291aac" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.238899 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dpk2w" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.238908 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dpk2w" event={"ID":"705966b1-0d0b-4c12-9cc1-830277fcf80c","Type":"ContainerDied","Data":"9b60c38e579d20045d1c1ef8eb393212f20806911c820f169e411a415e656eb0"} Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.238970 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b60c38e579d20045d1c1ef8eb393212f20806911c820f169e411a415e656eb0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.240608 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb","Type":"ContainerStarted","Data":"dfdd567b54cec1e2a68e497043cc19ed56399fd5c0e540c8ccc69b242bb6f5ce"} Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.240633 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"0aa3d7ce-c1f2-40a5-b63b-b39daee108fb","Type":"ContainerStarted","Data":"22ce6b22e3131423f5efe15692f0a02d146aed9517f1684b933d11904b490819"} Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.241831 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.248123 4835 generic.go:334] "Generic (PLEG): container finished" podID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerID="c65996a826cd8ebb9934d2351812b1e05bd9a0d4ea0f1e45173803e42601ea5a" exitCode=0 Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.248149 4835 generic.go:334] "Generic (PLEG): container finished" podID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerID="2d8b620a897b5436e95f47f581862450b09d7ca2e085f89555d7329fe76e3fa1" exitCode=2 Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.248157 4835 generic.go:334] "Generic (PLEG): container finished" podID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerID="6af7b4aca2ab42921f496c3018bb159e5b3950431d0b3ea9aebf155918194167" exitCode=0 Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.248203 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af75f57a-7612-48c8-b3fb-8594e81e2d0a","Type":"ContainerDied","Data":"c65996a826cd8ebb9934d2351812b1e05bd9a0d4ea0f1e45173803e42601ea5a"} Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.248227 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af75f57a-7612-48c8-b3fb-8594e81e2d0a","Type":"ContainerDied","Data":"2d8b620a897b5436e95f47f581862450b09d7ca2e085f89555d7329fe76e3fa1"} Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.248237 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af75f57a-7612-48c8-b3fb-8594e81e2d0a","Type":"ContainerDied","Data":"6af7b4aca2ab42921f496c3018bb159e5b3950431d0b3ea9aebf155918194167"} Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.268244 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=7.268196767 podStartE2EDuration="7.268196767s" podCreationTimestamp="2025-10-03 18:33:01 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:33:08.261658358 +0000 UTC m=+1129.977599250" watchObservedRunningTime="2025-10-03 18:33:08.268196767 +0000 UTC m=+1129.984137659" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.440815 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 18:33:08 crc kubenswrapper[4835]: E1003 18:33:08.441513 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed367e4-c09b-46d6-82d0-f43eb6c4417d" containerName="dnsmasq-dns" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.441525 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed367e4-c09b-46d6-82d0-f43eb6c4417d" containerName="dnsmasq-dns" Oct 03 18:33:08 crc kubenswrapper[4835]: E1003 18:33:08.441569 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="705966b1-0d0b-4c12-9cc1-830277fcf80c" containerName="cinder-db-sync" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.441576 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="705966b1-0d0b-4c12-9cc1-830277fcf80c" containerName="cinder-db-sync" Oct 03 18:33:08 crc kubenswrapper[4835]: E1003 18:33:08.441595 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed367e4-c09b-46d6-82d0-f43eb6c4417d" containerName="init" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.441601 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed367e4-c09b-46d6-82d0-f43eb6c4417d" containerName="init" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.441794 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="705966b1-0d0b-4c12-9cc1-830277fcf80c" containerName="cinder-db-sync" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.441819 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed367e4-c09b-46d6-82d0-f43eb6c4417d" containerName="dnsmasq-dns" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.442885 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.447140 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x2whm" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.453440 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.453638 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.453783 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.514140 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.570063 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-config-data\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.570119 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg5s7\" (UniqueName: \"kubernetes.io/projected/ab985863-4ac9-44bc-977a-241abc2c4635-kube-api-access-sg5s7\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.570187 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab985863-4ac9-44bc-977a-241abc2c4635-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.570214 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-scripts\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.570236 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.570255 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.608794 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7544fb748f-gl6nd"] Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.610351 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.643696 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7544fb748f-gl6nd"] Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.671582 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab985863-4ac9-44bc-977a-241abc2c4635-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.671636 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-scripts\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.671661 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.671684 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.671770 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-config-data\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.671790 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg5s7\" (UniqueName: \"kubernetes.io/projected/ab985863-4ac9-44bc-977a-241abc2c4635-kube-api-access-sg5s7\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.672239 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab985863-4ac9-44bc-977a-241abc2c4635-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.682005 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-scripts\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.689860 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-config-data\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 
18:33:08.697674 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.707699 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.716712 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg5s7\" (UniqueName: \"kubernetes.io/projected/ab985863-4ac9-44bc-977a-241abc2c4635-kube-api-access-sg5s7\") pod \"cinder-scheduler-0\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.736988 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.738614 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.741399 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.767844 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.773194 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-dns-swift-storage-0\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.773259 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-ovsdbserver-nb\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.773292 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd7pz\" (UniqueName: \"kubernetes.io/projected/a156c146-9589-4de6-8a2e-9c6740a94542-kube-api-access-hd7pz\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.773311 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-config\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.773336 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-ovsdbserver-sb\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.773362 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-dns-svc\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.799028 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.811101 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.847212 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9cdbfcc7d-ccpdw" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.876396 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-config\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.876438 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-ovsdbserver-sb\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.876464 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-logs\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.876480 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-dns-svc\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.876526 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-scripts\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.876581 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2qhj\" (UniqueName: \"kubernetes.io/projected/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-kube-api-access-x2qhj\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.876603 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-config-data-custom\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.876622 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-config-data\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.876656 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.876677 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-dns-swift-storage-0\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.876725 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-ovsdbserver-nb\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.876750 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.876768 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd7pz\" (UniqueName: \"kubernetes.io/projected/a156c146-9589-4de6-8a2e-9c6740a94542-kube-api-access-hd7pz\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.894086 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-ovsdbserver-sb\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.894975 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-dns-swift-storage-0\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.895814 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-ovsdbserver-nb\") 
pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.895865 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-config\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.899266 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-dns-svc\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.914243 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ed367e4-c09b-46d6-82d0-f43eb6c4417d" path="/var/lib/kubelet/pods/9ed367e4-c09b-46d6-82d0-f43eb6c4417d/volumes" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.940002 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd7pz\" (UniqueName: \"kubernetes.io/projected/a156c146-9589-4de6-8a2e-9c6740a94542-kube-api-access-hd7pz\") pod \"dnsmasq-dns-7544fb748f-gl6nd\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.950559 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5b45756688-tbfsl"] Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.950763 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5b45756688-tbfsl" podUID="1505cc32-6896-425d-b36b-1b2d3504901b" containerName="barbican-api-log" containerID="cri-o://5260b2300fda23f80796cae4e675c763968e93c918171e583c4a45af8379ec71" gracePeriod=30 Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.951139 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5b45756688-tbfsl" podUID="1505cc32-6896-425d-b36b-1b2d3504901b" containerName="barbican-api" containerID="cri-o://8ae53724fccd3725c727bddb721ac81a7d0297b53ee19fe8f4276c44febb0e65" gracePeriod=30 Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.978405 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-scripts\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.978528 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2qhj\" (UniqueName: \"kubernetes.io/projected/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-kube-api-access-x2qhj\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.978573 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-config-data-custom\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.978601 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-config-data\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.978688 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.978842 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.978919 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-logs\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.979435 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-logs\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.981721 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.990874 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.997523 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-scripts\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:08 crc kubenswrapper[4835]: I1003 18:33:08.997926 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-config-data\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:09 crc kubenswrapper[4835]: I1003 18:33:09.007914 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2qhj\" (UniqueName: \"kubernetes.io/projected/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-kube-api-access-x2qhj\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:09 crc kubenswrapper[4835]: I1003 18:33:09.011185 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-config-data-custom\") pod \"cinder-api-0\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " pod="openstack/cinder-api-0" Oct 03 18:33:09 crc kubenswrapper[4835]: I1003 18:33:09.107430 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 18:33:09 crc kubenswrapper[4835]: I1003 18:33:09.239658 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:09 crc kubenswrapper[4835]: I1003 18:33:09.384961 4835 generic.go:334] "Generic (PLEG): container finished" podID="1505cc32-6896-425d-b36b-1b2d3504901b" containerID="5260b2300fda23f80796cae4e675c763968e93c918171e583c4a45af8379ec71" exitCode=143 Oct 03 18:33:09 crc kubenswrapper[4835]: I1003 18:33:09.385030 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b45756688-tbfsl" event={"ID":"1505cc32-6896-425d-b36b-1b2d3504901b","Type":"ContainerDied","Data":"5260b2300fda23f80796cae4e675c763968e93c918171e583c4a45af8379ec71"} Oct 03 18:33:09 crc kubenswrapper[4835]: I1003 18:33:09.392972 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8110d0e5-9e19-4306-b8aa-babe937e8d2a","Type":"ContainerStarted","Data":"93a43d0e5cc9cf9fc0d017857367c8f6de72ed7cb5c9b28e99b416df23f0b755"} Oct 03 18:33:09 crc kubenswrapper[4835]: I1003 18:33:09.410552 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 18:33:09 crc kubenswrapper[4835]: I1003 18:33:09.697576 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 18:33:09 crc kubenswrapper[4835]: I1003 18:33:09.975200 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7544fb748f-gl6nd"] Oct 03 18:33:10 crc kubenswrapper[4835]: W1003 18:33:10.009220 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda156c146_9589_4de6_8a2e_9c6740a94542.slice/crio-20e836339cf2ff4d474dafc8c728187e0c0a5f51660378abf0837538eb6b0497 WatchSource:0}: Error finding container 20e836339cf2ff4d474dafc8c728187e0c0a5f51660378abf0837538eb6b0497: Status 404 returned error can't find the container with id 20e836339cf2ff4d474dafc8c728187e0c0a5f51660378abf0837538eb6b0497 Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.076887 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-64bb4dc8df-rxs4r" podUID="9ed367e4-c09b-46d6-82d0-f43eb6c4417d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.159:5353: i/o timeout" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.319430 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.420222 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169","Type":"ContainerStarted","Data":"1e3aa2b721de4b85ba2889babf74538322b6f0939ca9322e039299f5f0e61aa1"} Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.423723 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" event={"ID":"a156c146-9589-4de6-8a2e-9c6740a94542","Type":"ContainerStarted","Data":"20e836339cf2ff4d474dafc8c728187e0c0a5f51660378abf0837538eb6b0497"} Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.424998 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ab985863-4ac9-44bc-977a-241abc2c4635","Type":"ContainerStarted","Data":"3c40559d361821195be188df0e526a4ef8b23401f622d98ae4729d1314bd6c97"} Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.427956 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af75f57a-7612-48c8-b3fb-8594e81e2d0a-run-httpd\") pod \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.427999 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-combined-ca-bundle\") pod \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.428030 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfv8r\" (UniqueName: \"kubernetes.io/projected/af75f57a-7612-48c8-b3fb-8594e81e2d0a-kube-api-access-xfv8r\") pod \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.428084 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-scripts\") pod \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.428124 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-config-data\") pod \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.428145 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-sg-core-conf-yaml\") pod \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.428216 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af75f57a-7612-48c8-b3fb-8594e81e2d0a-log-httpd\") pod \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\" (UID: \"af75f57a-7612-48c8-b3fb-8594e81e2d0a\") " Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.429955 4835 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/af75f57a-7612-48c8-b3fb-8594e81e2d0a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "af75f57a-7612-48c8-b3fb-8594e81e2d0a" (UID: "af75f57a-7612-48c8-b3fb-8594e81e2d0a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.432288 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af75f57a-7612-48c8-b3fb-8594e81e2d0a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "af75f57a-7612-48c8-b3fb-8594e81e2d0a" (UID: "af75f57a-7612-48c8-b3fb-8594e81e2d0a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.436223 4835 generic.go:334] "Generic (PLEG): container finished" podID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerID="64b3d992f1132cf7ecff87b380648189301d52c52ecdb56ec440407b04213ed6" exitCode=0 Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.436367 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.436694 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.439941 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-scripts" (OuterVolumeSpecName: "scripts") pod "af75f57a-7612-48c8-b3fb-8594e81e2d0a" (UID: "af75f57a-7612-48c8-b3fb-8594e81e2d0a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.440001 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af75f57a-7612-48c8-b3fb-8594e81e2d0a","Type":"ContainerDied","Data":"64b3d992f1132cf7ecff87b380648189301d52c52ecdb56ec440407b04213ed6"} Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.440033 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"af75f57a-7612-48c8-b3fb-8594e81e2d0a","Type":"ContainerDied","Data":"8acfdc035a7402aa4159aedbcb018a19baacd856fa0e00bc4fe893c372ffeeaf"} Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.440050 4835 scope.go:117] "RemoveContainer" containerID="c65996a826cd8ebb9934d2351812b1e05bd9a0d4ea0f1e45173803e42601ea5a" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.449148 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af75f57a-7612-48c8-b3fb-8594e81e2d0a-kube-api-access-xfv8r" (OuterVolumeSpecName: "kube-api-access-xfv8r") pod "af75f57a-7612-48c8-b3fb-8594e81e2d0a" (UID: "af75f57a-7612-48c8-b3fb-8594e81e2d0a"). InnerVolumeSpecName "kube-api-access-xfv8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.530641 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af75f57a-7612-48c8-b3fb-8594e81e2d0a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.530671 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfv8r\" (UniqueName: \"kubernetes.io/projected/af75f57a-7612-48c8-b3fb-8594e81e2d0a-kube-api-access-xfv8r\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.530685 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.530697 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/af75f57a-7612-48c8-b3fb-8594e81e2d0a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.568224 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "af75f57a-7612-48c8-b3fb-8594e81e2d0a" (UID: "af75f57a-7612-48c8-b3fb-8594e81e2d0a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.633740 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.639233 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af75f57a-7612-48c8-b3fb-8594e81e2d0a" (UID: "af75f57a-7612-48c8-b3fb-8594e81e2d0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.679258 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-config-data" (OuterVolumeSpecName: "config-data") pod "af75f57a-7612-48c8-b3fb-8594e81e2d0a" (UID: "af75f57a-7612-48c8-b3fb-8594e81e2d0a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.735224 4835 scope.go:117] "RemoveContainer" containerID="2d8b620a897b5436e95f47f581862450b09d7ca2e085f89555d7329fe76e3fa1" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.737580 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.737991 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af75f57a-7612-48c8-b3fb-8594e81e2d0a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.741585 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64dcfd48b6-tpcpd" podUID="de5d465a-f009-4cef-940e-3b2aaa64468b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.165:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.165:8443: connect: connection refused" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.776121 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.788384 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.797549 4835 scope.go:117] "RemoveContainer" containerID="64b3d992f1132cf7ecff87b380648189301d52c52ecdb56ec440407b04213ed6" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.801830 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:10 crc kubenswrapper[4835]: E1003 18:33:10.802244 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerName="sg-core" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.802262 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerName="sg-core" Oct 03 18:33:10 crc kubenswrapper[4835]: E1003 18:33:10.802279 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerName="ceilometer-central-agent" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.802285 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerName="ceilometer-central-agent" Oct 03 18:33:10 crc kubenswrapper[4835]: E1003 18:33:10.802312 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerName="ceilometer-notification-agent" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.802319 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerName="ceilometer-notification-agent" Oct 03 18:33:10 crc kubenswrapper[4835]: E1003 18:33:10.802330 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerName="proxy-httpd" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.802338 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerName="proxy-httpd" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.802542 4835 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerName="sg-core" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.802557 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerName="ceilometer-notification-agent" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.802569 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerName="proxy-httpd" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.802579 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" containerName="ceilometer-central-agent" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.804309 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.807396 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.808141 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.815498 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.834407 4835 scope.go:117] "RemoveContainer" containerID="6af7b4aca2ab42921f496c3018bb159e5b3950431d0b3ea9aebf155918194167" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.889146 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af75f57a-7612-48c8-b3fb-8594e81e2d0a" path="/var/lib/kubelet/pods/af75f57a-7612-48c8-b3fb-8594e81e2d0a/volumes" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.894202 4835 scope.go:117] "RemoveContainer" containerID="c65996a826cd8ebb9934d2351812b1e05bd9a0d4ea0f1e45173803e42601ea5a" Oct 03 18:33:10 crc kubenswrapper[4835]: E1003 18:33:10.896815 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c65996a826cd8ebb9934d2351812b1e05bd9a0d4ea0f1e45173803e42601ea5a\": container with ID starting with c65996a826cd8ebb9934d2351812b1e05bd9a0d4ea0f1e45173803e42601ea5a not found: ID does not exist" containerID="c65996a826cd8ebb9934d2351812b1e05bd9a0d4ea0f1e45173803e42601ea5a" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.896851 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c65996a826cd8ebb9934d2351812b1e05bd9a0d4ea0f1e45173803e42601ea5a"} err="failed to get container status \"c65996a826cd8ebb9934d2351812b1e05bd9a0d4ea0f1e45173803e42601ea5a\": rpc error: code = NotFound desc = could not find container \"c65996a826cd8ebb9934d2351812b1e05bd9a0d4ea0f1e45173803e42601ea5a\": container with ID starting with c65996a826cd8ebb9934d2351812b1e05bd9a0d4ea0f1e45173803e42601ea5a not found: ID does not exist" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.896876 4835 scope.go:117] "RemoveContainer" containerID="2d8b620a897b5436e95f47f581862450b09d7ca2e085f89555d7329fe76e3fa1" Oct 03 18:33:10 crc kubenswrapper[4835]: E1003 18:33:10.897116 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d8b620a897b5436e95f47f581862450b09d7ca2e085f89555d7329fe76e3fa1\": container with ID starting with 2d8b620a897b5436e95f47f581862450b09d7ca2e085f89555d7329fe76e3fa1 not 
found: ID does not exist" containerID="2d8b620a897b5436e95f47f581862450b09d7ca2e085f89555d7329fe76e3fa1" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.897137 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d8b620a897b5436e95f47f581862450b09d7ca2e085f89555d7329fe76e3fa1"} err="failed to get container status \"2d8b620a897b5436e95f47f581862450b09d7ca2e085f89555d7329fe76e3fa1\": rpc error: code = NotFound desc = could not find container \"2d8b620a897b5436e95f47f581862450b09d7ca2e085f89555d7329fe76e3fa1\": container with ID starting with 2d8b620a897b5436e95f47f581862450b09d7ca2e085f89555d7329fe76e3fa1 not found: ID does not exist" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.897154 4835 scope.go:117] "RemoveContainer" containerID="64b3d992f1132cf7ecff87b380648189301d52c52ecdb56ec440407b04213ed6" Oct 03 18:33:10 crc kubenswrapper[4835]: E1003 18:33:10.897547 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64b3d992f1132cf7ecff87b380648189301d52c52ecdb56ec440407b04213ed6\": container with ID starting with 64b3d992f1132cf7ecff87b380648189301d52c52ecdb56ec440407b04213ed6 not found: ID does not exist" containerID="64b3d992f1132cf7ecff87b380648189301d52c52ecdb56ec440407b04213ed6" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.897567 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64b3d992f1132cf7ecff87b380648189301d52c52ecdb56ec440407b04213ed6"} err="failed to get container status \"64b3d992f1132cf7ecff87b380648189301d52c52ecdb56ec440407b04213ed6\": rpc error: code = NotFound desc = could not find container \"64b3d992f1132cf7ecff87b380648189301d52c52ecdb56ec440407b04213ed6\": container with ID starting with 64b3d992f1132cf7ecff87b380648189301d52c52ecdb56ec440407b04213ed6 not found: ID does not exist" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.897579 4835 scope.go:117] "RemoveContainer" containerID="6af7b4aca2ab42921f496c3018bb159e5b3950431d0b3ea9aebf155918194167" Oct 03 18:33:10 crc kubenswrapper[4835]: E1003 18:33:10.898009 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6af7b4aca2ab42921f496c3018bb159e5b3950431d0b3ea9aebf155918194167\": container with ID starting with 6af7b4aca2ab42921f496c3018bb159e5b3950431d0b3ea9aebf155918194167 not found: ID does not exist" containerID="6af7b4aca2ab42921f496c3018bb159e5b3950431d0b3ea9aebf155918194167" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.898024 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af7b4aca2ab42921f496c3018bb159e5b3950431d0b3ea9aebf155918194167"} err="failed to get container status \"6af7b4aca2ab42921f496c3018bb159e5b3950431d0b3ea9aebf155918194167\": rpc error: code = NotFound desc = could not find container \"6af7b4aca2ab42921f496c3018bb159e5b3950431d0b3ea9aebf155918194167\": container with ID starting with 6af7b4aca2ab42921f496c3018bb159e5b3950431d0b3ea9aebf155918194167 not found: ID does not exist" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.946203 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7762654-21e9-4999-b701-7498146153b2-run-httpd\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:10 
crc kubenswrapper[4835]: I1003 18:33:10.946256 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdjmn\" (UniqueName: \"kubernetes.io/projected/c7762654-21e9-4999-b701-7498146153b2-kube-api-access-bdjmn\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.946311 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-config-data\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.946349 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7762654-21e9-4999-b701-7498146153b2-log-httpd\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.946364 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-scripts\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.946385 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:10 crc kubenswrapper[4835]: I1003 18:33:10.946432 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.048009 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7762654-21e9-4999-b701-7498146153b2-run-httpd\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.048475 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7762654-21e9-4999-b701-7498146153b2-run-httpd\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.048080 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdjmn\" (UniqueName: \"kubernetes.io/projected/c7762654-21e9-4999-b701-7498146153b2-kube-api-access-bdjmn\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.049269 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-config-data\") pod \"ceilometer-0\" 
(UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.049297 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7762654-21e9-4999-b701-7498146153b2-log-httpd\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.049314 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-scripts\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.049343 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.049426 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.054875 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7762654-21e9-4999-b701-7498146153b2-log-httpd\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.059612 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-scripts\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.060268 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.060889 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-config-data\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.061306 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.077296 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.085874 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdjmn\" (UniqueName: 
\"kubernetes.io/projected/c7762654-21e9-4999-b701-7498146153b2-kube-api-access-bdjmn\") pod \"ceilometer-0\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " pod="openstack/ceilometer-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.147352 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.472637 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169","Type":"ContainerStarted","Data":"c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897"} Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.475398 4835 generic.go:334] "Generic (PLEG): container finished" podID="a156c146-9589-4de6-8a2e-9c6740a94542" containerID="16b1f41c6593896b48d38893b0ddc28e44afc0acec69c59e9285d085c458c55e" exitCode=0 Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.475456 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" event={"ID":"a156c146-9589-4de6-8a2e-9c6740a94542","Type":"ContainerDied","Data":"16b1f41c6593896b48d38893b0ddc28e44afc0acec69c59e9285d085c458c55e"} Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.478495 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ab985863-4ac9-44bc-977a-241abc2c4635","Type":"ContainerStarted","Data":"8496edcaf7deb698b14d7a65627efb629e03abae511efc15bc13a13f89f2df07"} Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.798040 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:11 crc kubenswrapper[4835]: W1003 18:33:11.808821 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7762654_21e9_4999_b701_7498146153b2.slice/crio-85d82407301e02fdd5f71767b6b4a24c753806dd73ab2bf21a836e46f2cb1c9e WatchSource:0}: Error finding container 85d82407301e02fdd5f71767b6b4a24c753806dd73ab2bf21a836e46f2cb1c9e: Status 404 returned error can't find the container with id 85d82407301e02fdd5f71767b6b4a24c753806dd73ab2bf21a836e46f2cb1c9e Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.820790 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.820927 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 18:33:11 crc kubenswrapper[4835]: I1003 18:33:11.822275 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 03 18:33:12 crc kubenswrapper[4835]: I1003 18:33:12.491329 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ab985863-4ac9-44bc-977a-241abc2c4635","Type":"ContainerStarted","Data":"f8c4f7746c74ef457dac60a74b3dae6438722672394bf99df6697234fecafa91"} Oct 03 18:33:12 crc kubenswrapper[4835]: I1003 18:33:12.521146 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.976709998 podStartE2EDuration="4.5211247s" podCreationTimestamp="2025-10-03 18:33:08 +0000 UTC" firstStartedPulling="2025-10-03 18:33:09.4378958 +0000 UTC m=+1131.153836672" lastFinishedPulling="2025-10-03 18:33:09.982310502 +0000 UTC m=+1131.698251374" observedRunningTime="2025-10-03 18:33:12.510287026 +0000 UTC m=+1134.226227898" 
watchObservedRunningTime="2025-10-03 18:33:12.5211247 +0000 UTC m=+1134.237065572" Oct 03 18:33:12 crc kubenswrapper[4835]: I1003 18:33:12.527530 4835 generic.go:334] "Generic (PLEG): container finished" podID="1505cc32-6896-425d-b36b-1b2d3504901b" containerID="8ae53724fccd3725c727bddb721ac81a7d0297b53ee19fe8f4276c44febb0e65" exitCode=0 Oct 03 18:33:12 crc kubenswrapper[4835]: I1003 18:33:12.527634 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b45756688-tbfsl" event={"ID":"1505cc32-6896-425d-b36b-1b2d3504901b","Type":"ContainerDied","Data":"8ae53724fccd3725c727bddb721ac81a7d0297b53ee19fe8f4276c44febb0e65"} Oct 03 18:33:12 crc kubenswrapper[4835]: I1003 18:33:12.543785 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169","Type":"ContainerStarted","Data":"3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4"} Oct 03 18:33:12 crc kubenswrapper[4835]: I1003 18:33:12.543963 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" containerName="cinder-api-log" containerID="cri-o://c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897" gracePeriod=30 Oct 03 18:33:12 crc kubenswrapper[4835]: I1003 18:33:12.544093 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 18:33:12 crc kubenswrapper[4835]: I1003 18:33:12.544440 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" containerName="cinder-api" containerID="cri-o://3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4" gracePeriod=30 Oct 03 18:33:12 crc kubenswrapper[4835]: I1003 18:33:12.552256 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7762654-21e9-4999-b701-7498146153b2","Type":"ContainerStarted","Data":"f47b2382957676c586d20de83df106f0864dd9fcc944854a455b9bf25e1b8160"} Oct 03 18:33:12 crc kubenswrapper[4835]: I1003 18:33:12.552315 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7762654-21e9-4999-b701-7498146153b2","Type":"ContainerStarted","Data":"85d82407301e02fdd5f71767b6b4a24c753806dd73ab2bf21a836e46f2cb1c9e"} Oct 03 18:33:12 crc kubenswrapper[4835]: I1003 18:33:12.564537 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 18:33:12 crc kubenswrapper[4835]: I1003 18:33:12.565878 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" event={"ID":"a156c146-9589-4de6-8a2e-9c6740a94542","Type":"ContainerStarted","Data":"981feb0827f1e186ceb1a8923ef3f31bfaee1f23d42cb1711e983d3cd9761aca"} Oct 03 18:33:12 crc kubenswrapper[4835]: I1003 18:33:12.565952 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:12 crc kubenswrapper[4835]: I1003 18:33:12.568117 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.568098982 podStartE2EDuration="4.568098982s" podCreationTimestamp="2025-10-03 18:33:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:33:12.562986088 +0000 UTC m=+1134.278926960" watchObservedRunningTime="2025-10-03 
18:33:12.568098982 +0000 UTC m=+1134.284039854" Oct 03 18:33:12 crc kubenswrapper[4835]: I1003 18:33:12.595556 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" podStartSLOduration=4.59553566 podStartE2EDuration="4.59553566s" podCreationTimestamp="2025-10-03 18:33:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:33:12.587655338 +0000 UTC m=+1134.303596210" watchObservedRunningTime="2025-10-03 18:33:12.59553566 +0000 UTC m=+1134.311476532" Oct 03 18:33:12 crc kubenswrapper[4835]: I1003 18:33:12.825286 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/watcher-api-0" podUID="0aa3d7ce-c1f2-40a5-b63b-b39daee108fb" containerName="watcher-api-log" probeResult="failure" output="Get \"https://10.217.0.178:9322/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.018913 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.122029 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.127519 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-config-data\") pod \"1505cc32-6896-425d-b36b-1b2d3504901b\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.127588 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-combined-ca-bundle\") pod \"1505cc32-6896-425d-b36b-1b2d3504901b\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.127695 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4snv\" (UniqueName: \"kubernetes.io/projected/1505cc32-6896-425d-b36b-1b2d3504901b-kube-api-access-g4snv\") pod \"1505cc32-6896-425d-b36b-1b2d3504901b\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.128160 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-config-data-custom\") pod \"1505cc32-6896-425d-b36b-1b2d3504901b\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.128473 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1505cc32-6896-425d-b36b-1b2d3504901b-logs\") pod \"1505cc32-6896-425d-b36b-1b2d3504901b\" (UID: \"1505cc32-6896-425d-b36b-1b2d3504901b\") " Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.129462 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1505cc32-6896-425d-b36b-1b2d3504901b-logs" (OuterVolumeSpecName: "logs") pod "1505cc32-6896-425d-b36b-1b2d3504901b" (UID: "1505cc32-6896-425d-b36b-1b2d3504901b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.137291 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1505cc32-6896-425d-b36b-1b2d3504901b-kube-api-access-g4snv" (OuterVolumeSpecName: "kube-api-access-g4snv") pod "1505cc32-6896-425d-b36b-1b2d3504901b" (UID: "1505cc32-6896-425d-b36b-1b2d3504901b"). InnerVolumeSpecName "kube-api-access-g4snv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.141164 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1505cc32-6896-425d-b36b-1b2d3504901b" (UID: "1505cc32-6896-425d-b36b-1b2d3504901b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.174164 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1505cc32-6896-425d-b36b-1b2d3504901b" (UID: "1505cc32-6896-425d-b36b-1b2d3504901b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.209342 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-config-data" (OuterVolumeSpecName: "config-data") pod "1505cc32-6896-425d-b36b-1b2d3504901b" (UID: "1505cc32-6896-425d-b36b-1b2d3504901b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.230605 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1505cc32-6896-425d-b36b-1b2d3504901b-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.230648 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.230662 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.230674 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4snv\" (UniqueName: \"kubernetes.io/projected/1505cc32-6896-425d-b36b-1b2d3504901b-kube-api-access-g4snv\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.230686 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1505cc32-6896-425d-b36b-1b2d3504901b-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.556667 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.579547 4835 generic.go:334] "Generic (PLEG): container finished" podID="4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" containerID="3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4" exitCode=0 Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.579583 4835 generic.go:334] "Generic (PLEG): container finished" podID="4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" containerID="c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897" exitCode=143 Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.579629 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169","Type":"ContainerDied","Data":"3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4"} Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.579671 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169","Type":"ContainerDied","Data":"c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897"} Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.579685 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169","Type":"ContainerDied","Data":"1e3aa2b721de4b85ba2889babf74538322b6f0939ca9322e039299f5f0e61aa1"} Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.579703 4835 scope.go:117] "RemoveContainer" containerID="3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.579852 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.584461 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7762654-21e9-4999-b701-7498146153b2","Type":"ContainerStarted","Data":"34da65470a92bff1873d6194522087c396885e0763a63c540d1699999c59c7d5"} Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.590715 4835 generic.go:334] "Generic (PLEG): container finished" podID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerID="93a43d0e5cc9cf9fc0d017857367c8f6de72ed7cb5c9b28e99b416df23f0b755" exitCode=1 Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.590764 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8110d0e5-9e19-4306-b8aa-babe937e8d2a","Type":"ContainerDied","Data":"93a43d0e5cc9cf9fc0d017857367c8f6de72ed7cb5c9b28e99b416df23f0b755"} Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.591368 4835 scope.go:117] "RemoveContainer" containerID="93a43d0e5cc9cf9fc0d017857367c8f6de72ed7cb5c9b28e99b416df23f0b755" Oct 03 18:33:13 crc kubenswrapper[4835]: E1003 18:33:13.591655 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(8110d0e5-9e19-4306-b8aa-babe937e8d2a)\"" pod="openstack/watcher-decision-engine-0" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.600131 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b45756688-tbfsl" 
event={"ID":"1505cc32-6896-425d-b36b-1b2d3504901b","Type":"ContainerDied","Data":"3786801d1aa6390ca652db72432821110148bfc2193c91680ff9359675379e1a"} Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.600237 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b45756688-tbfsl" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.624459 4835 scope.go:117] "RemoveContainer" containerID="c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.667150 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5b45756688-tbfsl"] Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.672300 4835 scope.go:117] "RemoveContainer" containerID="3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.673644 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5b45756688-tbfsl"] Oct 03 18:33:13 crc kubenswrapper[4835]: E1003 18:33:13.677290 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4\": container with ID starting with 3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4 not found: ID does not exist" containerID="3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.677335 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4"} err="failed to get container status \"3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4\": rpc error: code = NotFound desc = could not find container \"3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4\": container with ID starting with 3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4 not found: ID does not exist" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.677360 4835 scope.go:117] "RemoveContainer" containerID="c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897" Oct 03 18:33:13 crc kubenswrapper[4835]: E1003 18:33:13.678220 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897\": container with ID starting with c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897 not found: ID does not exist" containerID="c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.678263 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897"} err="failed to get container status \"c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897\": rpc error: code = NotFound desc = could not find container \"c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897\": container with ID starting with c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897 not found: ID does not exist" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.678303 4835 scope.go:117] "RemoveContainer" containerID="3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 
18:33:13.678732 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4"} err="failed to get container status \"3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4\": rpc error: code = NotFound desc = could not find container \"3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4\": container with ID starting with 3478bed67a7bb3a742466ddcb44fa47b3439a25ecd2fc637b0c32b061c286aa4 not found: ID does not exist" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.678762 4835 scope.go:117] "RemoveContainer" containerID="c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.679044 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897"} err="failed to get container status \"c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897\": rpc error: code = NotFound desc = could not find container \"c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897\": container with ID starting with c685020a6c7336af51d85265d27db3ecc65edfed5bdbe0ae2dd58bed766bd897 not found: ID does not exist" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.679155 4835 scope.go:117] "RemoveContainer" containerID="e806a91c1d0ed6742cb427f345aabf9762384e27525ea6c3dea58c86eb291aac" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.742581 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-etc-machine-id\") pod \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.742632 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2qhj\" (UniqueName: \"kubernetes.io/projected/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-kube-api-access-x2qhj\") pod \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.742690 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-scripts\") pod \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.742739 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-config-data-custom\") pod \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.742762 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-config-data\") pod \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.742834 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-combined-ca-bundle\") pod 
\"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.742971 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-logs\") pod \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\" (UID: \"4c9812bb-d7b4-4c85-bc8c-318d2fbf4169\") " Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.743669 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" (UID: "4c9812bb-d7b4-4c85-bc8c-318d2fbf4169"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.744158 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-logs" (OuterVolumeSpecName: "logs") pod "4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" (UID: "4c9812bb-d7b4-4c85-bc8c-318d2fbf4169"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.750787 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-kube-api-access-x2qhj" (OuterVolumeSpecName: "kube-api-access-x2qhj") pod "4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" (UID: "4c9812bb-d7b4-4c85-bc8c-318d2fbf4169"). InnerVolumeSpecName "kube-api-access-x2qhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.755177 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" (UID: "4c9812bb-d7b4-4c85-bc8c-318d2fbf4169"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.758526 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-scripts" (OuterVolumeSpecName: "scripts") pod "4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" (UID: "4c9812bb-d7b4-4c85-bc8c-318d2fbf4169"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.761468 4835 scope.go:117] "RemoveContainer" containerID="8ae53724fccd3725c727bddb721ac81a7d0297b53ee19fe8f4276c44febb0e65" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.802023 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.810285 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" (UID: "4c9812bb-d7b4-4c85-bc8c-318d2fbf4169"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.816051 4835 scope.go:117] "RemoveContainer" containerID="5260b2300fda23f80796cae4e675c763968e93c918171e583c4a45af8379ec71" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.838329 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-config-data" (OuterVolumeSpecName: "config-data") pod "4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" (UID: "4c9812bb-d7b4-4c85-bc8c-318d2fbf4169"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.845420 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.845517 4835 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.845576 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2qhj\" (UniqueName: \"kubernetes.io/projected/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-kube-api-access-x2qhj\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.845636 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.845688 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.845745 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.845837 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.908524 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.916306 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.929679 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 18:33:13 crc kubenswrapper[4835]: E1003 18:33:13.933436 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1505cc32-6896-425d-b36b-1b2d3504901b" containerName="barbican-api" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.933467 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1505cc32-6896-425d-b36b-1b2d3504901b" containerName="barbican-api" Oct 03 18:33:13 crc kubenswrapper[4835]: E1003 18:33:13.933480 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1505cc32-6896-425d-b36b-1b2d3504901b" containerName="barbican-api-log" Oct 
03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.933487 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1505cc32-6896-425d-b36b-1b2d3504901b" containerName="barbican-api-log" Oct 03 18:33:13 crc kubenswrapper[4835]: E1003 18:33:13.933523 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" containerName="cinder-api" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.933531 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" containerName="cinder-api" Oct 03 18:33:13 crc kubenswrapper[4835]: E1003 18:33:13.933554 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" containerName="cinder-api-log" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.933561 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" containerName="cinder-api-log" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.933799 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1505cc32-6896-425d-b36b-1b2d3504901b" containerName="barbican-api" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.933816 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1505cc32-6896-425d-b36b-1b2d3504901b" containerName="barbican-api-log" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.933826 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" containerName="cinder-api" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.933853 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" containerName="cinder-api-log" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.935195 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.938186 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.938425 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.938510 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 18:33:13 crc kubenswrapper[4835]: I1003 18:33:13.952750 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.049663 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d985d9ce-6643-4a1f-a889-4a61beb59bfa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.049862 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wctx5\" (UniqueName: \"kubernetes.io/projected/d985d9ce-6643-4a1f-a889-4a61beb59bfa-kube-api-access-wctx5\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.050859 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d985d9ce-6643-4a1f-a889-4a61beb59bfa-logs\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.050936 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-config-data-custom\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.051087 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.051235 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.051267 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-config-data\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.051331 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-scripts\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.051416 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.174840 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wctx5\" (UniqueName: \"kubernetes.io/projected/d985d9ce-6643-4a1f-a889-4a61beb59bfa-kube-api-access-wctx5\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.174945 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d985d9ce-6643-4a1f-a889-4a61beb59bfa-logs\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.174980 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-config-data-custom\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.175032 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.175089 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.175104 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-config-data\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.175132 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-scripts\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.175171 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.175561 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/d985d9ce-6643-4a1f-a889-4a61beb59bfa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.175660 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d985d9ce-6643-4a1f-a889-4a61beb59bfa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.176088 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d985d9ce-6643-4a1f-a889-4a61beb59bfa-logs\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.182806 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-config-data\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.182853 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-config-data-custom\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.184109 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.184768 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.184842 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.191650 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d985d9ce-6643-4a1f-a889-4a61beb59bfa-scripts\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.192698 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wctx5\" (UniqueName: \"kubernetes.io/projected/d985d9ce-6643-4a1f-a889-4a61beb59bfa-kube-api-access-wctx5\") pod \"cinder-api-0\" (UID: \"d985d9ce-6643-4a1f-a889-4a61beb59bfa\") " pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.253800 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.610723 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7762654-21e9-4999-b701-7498146153b2","Type":"ContainerStarted","Data":"0f6d996612c55cd95822a881c0a094bac2d7d280e2f7fdfbf163ace4f5e5fa01"} Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.614183 4835 generic.go:334] "Generic (PLEG): container finished" podID="4f142f3b-9cce-451e-82b0-bfdac3ec661c" containerID="77b55f5478320213dad5b94639c84645d6fcdc901ad041694eeb730d015016f5" exitCode=0 Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.614234 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d274k" event={"ID":"4f142f3b-9cce-451e-82b0-bfdac3ec661c","Type":"ContainerDied","Data":"77b55f5478320213dad5b94639c84645d6fcdc901ad041694eeb730d015016f5"} Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.736888 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.895964 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1505cc32-6896-425d-b36b-1b2d3504901b" path="/var/lib/kubelet/pods/1505cc32-6896-425d-b36b-1b2d3504901b/volumes" Oct 03 18:33:14 crc kubenswrapper[4835]: I1003 18:33:14.897197 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c9812bb-d7b4-4c85-bc8c-318d2fbf4169" path="/var/lib/kubelet/pods/4c9812bb-d7b4-4c85-bc8c-318d2fbf4169/volumes" Oct 03 18:33:15 crc kubenswrapper[4835]: I1003 18:33:15.629147 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d985d9ce-6643-4a1f-a889-4a61beb59bfa","Type":"ContainerStarted","Data":"e1f74e0d130a68f4fa2c902b1044762f143eda20982d4bf4011413de271f2445"} Oct 03 18:33:15 crc kubenswrapper[4835]: I1003 18:33:15.629496 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d985d9ce-6643-4a1f-a889-4a61beb59bfa","Type":"ContainerStarted","Data":"3c75f447d28695eeb8e95ccef53f7667183e49434f1de1664f971643c33f2bcb"} Oct 03 18:33:15 crc kubenswrapper[4835]: I1003 18:33:15.635519 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7762654-21e9-4999-b701-7498146153b2","Type":"ContainerStarted","Data":"8a4226ecb007fd3ecbc03cb2ea96919d03f62135876d60e5152d7cfa3d1763e3"} Oct 03 18:33:15 crc kubenswrapper[4835]: I1003 18:33:15.635680 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 18:33:15 crc kubenswrapper[4835]: I1003 18:33:15.664312 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.548702671 podStartE2EDuration="5.664289858s" podCreationTimestamp="2025-10-03 18:33:10 +0000 UTC" firstStartedPulling="2025-10-03 18:33:11.822296721 +0000 UTC m=+1133.538237593" lastFinishedPulling="2025-10-03 18:33:14.937883908 +0000 UTC m=+1136.653824780" observedRunningTime="2025-10-03 18:33:15.661285094 +0000 UTC m=+1137.377225966" watchObservedRunningTime="2025-10-03 18:33:15.664289858 +0000 UTC m=+1137.380230730" Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.068401 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-d274k" Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.211499 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f142f3b-9cce-451e-82b0-bfdac3ec661c-config\") pod \"4f142f3b-9cce-451e-82b0-bfdac3ec661c\" (UID: \"4f142f3b-9cce-451e-82b0-bfdac3ec661c\") " Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.211960 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f142f3b-9cce-451e-82b0-bfdac3ec661c-combined-ca-bundle\") pod \"4f142f3b-9cce-451e-82b0-bfdac3ec661c\" (UID: \"4f142f3b-9cce-451e-82b0-bfdac3ec661c\") " Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.212130 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbh4l\" (UniqueName: \"kubernetes.io/projected/4f142f3b-9cce-451e-82b0-bfdac3ec661c-kube-api-access-dbh4l\") pod \"4f142f3b-9cce-451e-82b0-bfdac3ec661c\" (UID: \"4f142f3b-9cce-451e-82b0-bfdac3ec661c\") " Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.218960 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f142f3b-9cce-451e-82b0-bfdac3ec661c-kube-api-access-dbh4l" (OuterVolumeSpecName: "kube-api-access-dbh4l") pod "4f142f3b-9cce-451e-82b0-bfdac3ec661c" (UID: "4f142f3b-9cce-451e-82b0-bfdac3ec661c"). InnerVolumeSpecName "kube-api-access-dbh4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.244039 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f142f3b-9cce-451e-82b0-bfdac3ec661c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f142f3b-9cce-451e-82b0-bfdac3ec661c" (UID: "4f142f3b-9cce-451e-82b0-bfdac3ec661c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.249719 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f142f3b-9cce-451e-82b0-bfdac3ec661c-config" (OuterVolumeSpecName: "config") pod "4f142f3b-9cce-451e-82b0-bfdac3ec661c" (UID: "4f142f3b-9cce-451e-82b0-bfdac3ec661c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.315262 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbh4l\" (UniqueName: \"kubernetes.io/projected/4f142f3b-9cce-451e-82b0-bfdac3ec661c-kube-api-access-dbh4l\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.315322 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f142f3b-9cce-451e-82b0-bfdac3ec661c-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.315338 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f142f3b-9cce-451e-82b0-bfdac3ec661c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.649766 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d274k" event={"ID":"4f142f3b-9cce-451e-82b0-bfdac3ec661c","Type":"ContainerDied","Data":"66acab9f51780b5c23866d9c03343cc0e58cc37bb2b57fa34e46e79006182dcb"} Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.649800 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-d274k" Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.649808 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66acab9f51780b5c23866d9c03343cc0e58cc37bb2b57fa34e46e79006182dcb" Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.654914 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d985d9ce-6643-4a1f-a889-4a61beb59bfa","Type":"ContainerStarted","Data":"f07b10282b51288f017e841965506fe44fdb9e133aa499725293abe6fd165505"} Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.687737 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.687717733 podStartE2EDuration="3.687717733s" podCreationTimestamp="2025-10-03 18:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:33:16.677381301 +0000 UTC m=+1138.393322183" watchObservedRunningTime="2025-10-03 18:33:16.687717733 +0000 UTC m=+1138.403658595" Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.912804 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7544fb748f-gl6nd"] Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.919836 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.928185 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" podUID="a156c146-9589-4de6-8a2e-9c6740a94542" containerName="dnsmasq-dns" containerID="cri-o://981feb0827f1e186ceb1a8923ef3f31bfaee1f23d42cb1711e983d3cd9761aca" gracePeriod=10 Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.979408 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d779b7fdc-pqmkj"] Oct 03 18:33:16 crc kubenswrapper[4835]: E1003 18:33:16.980295 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f142f3b-9cce-451e-82b0-bfdac3ec661c" containerName="neutron-db-sync" Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.980313 4835 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4f142f3b-9cce-451e-82b0-bfdac3ec661c" containerName="neutron-db-sync" Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.980582 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f142f3b-9cce-451e-82b0-bfdac3ec661c" containerName="neutron-db-sync" Oct 03 18:33:16 crc kubenswrapper[4835]: I1003 18:33:16.981833 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.002289 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d779b7fdc-pqmkj"] Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.043168 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-bc8b99c48-vnw8l"] Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.044857 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.048027 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.048237 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.048467 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.048967 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-k4s6x" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.051898 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bc8b99c48-vnw8l"] Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.162666 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7l57\" (UniqueName: \"kubernetes.io/projected/e203cdb5-ef30-469a-bc4e-3bae2306043d-kube-api-access-g7l57\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.162709 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-config\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.162749 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-httpd-config\") pod \"neutron-bc8b99c48-vnw8l\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.162781 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-ovsdbserver-sb\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.162804 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-ovndb-tls-certs\") pod \"neutron-bc8b99c48-vnw8l\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.162971 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-dns-swift-storage-0\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.163088 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-dns-svc\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.163135 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-config\") pod \"neutron-bc8b99c48-vnw8l\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.163229 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jp6x\" (UniqueName: \"kubernetes.io/projected/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-kube-api-access-2jp6x\") pod \"neutron-bc8b99c48-vnw8l\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.163260 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-ovsdbserver-nb\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.163334 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-combined-ca-bundle\") pod \"neutron-bc8b99c48-vnw8l\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.265017 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-ovsdbserver-sb\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.265153 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-ovndb-tls-certs\") pod \"neutron-bc8b99c48-vnw8l\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.265237 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-dns-swift-storage-0\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.265279 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-dns-svc\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.265339 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-config\") pod \"neutron-bc8b99c48-vnw8l\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.265420 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jp6x\" (UniqueName: \"kubernetes.io/projected/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-kube-api-access-2jp6x\") pod \"neutron-bc8b99c48-vnw8l\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.265473 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-ovsdbserver-nb\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.265522 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-combined-ca-bundle\") pod \"neutron-bc8b99c48-vnw8l\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.265606 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7l57\" (UniqueName: \"kubernetes.io/projected/e203cdb5-ef30-469a-bc4e-3bae2306043d-kube-api-access-g7l57\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.265658 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-config\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.265732 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-httpd-config\") pod \"neutron-bc8b99c48-vnw8l\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.265975 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-ovsdbserver-sb\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.266561 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-dns-svc\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.267131 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-dns-swift-storage-0\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.268247 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-ovsdbserver-nb\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.270980 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-config\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.281309 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-httpd-config\") pod \"neutron-bc8b99c48-vnw8l\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.281864 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-ovndb-tls-certs\") pod \"neutron-bc8b99c48-vnw8l\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.286301 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-combined-ca-bundle\") pod \"neutron-bc8b99c48-vnw8l\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.287903 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jp6x\" (UniqueName: \"kubernetes.io/projected/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-kube-api-access-2jp6x\") pod \"neutron-bc8b99c48-vnw8l\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.288021 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-config\") pod \"neutron-bc8b99c48-vnw8l\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " pod="openstack/neutron-bc8b99c48-vnw8l" 
Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.291442 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7l57\" (UniqueName: \"kubernetes.io/projected/e203cdb5-ef30-469a-bc4e-3bae2306043d-kube-api-access-g7l57\") pod \"dnsmasq-dns-d779b7fdc-pqmkj\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.328199 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.407309 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.538883 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.682603 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-dns-swift-storage-0\") pod \"a156c146-9589-4de6-8a2e-9c6740a94542\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.682685 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd7pz\" (UniqueName: \"kubernetes.io/projected/a156c146-9589-4de6-8a2e-9c6740a94542-kube-api-access-hd7pz\") pod \"a156c146-9589-4de6-8a2e-9c6740a94542\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.682705 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-config\") pod \"a156c146-9589-4de6-8a2e-9c6740a94542\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.682780 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-dns-svc\") pod \"a156c146-9589-4de6-8a2e-9c6740a94542\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.682872 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-ovsdbserver-nb\") pod \"a156c146-9589-4de6-8a2e-9c6740a94542\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.682900 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-ovsdbserver-sb\") pod \"a156c146-9589-4de6-8a2e-9c6740a94542\" (UID: \"a156c146-9589-4de6-8a2e-9c6740a94542\") " Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.713369 4835 generic.go:334] "Generic (PLEG): container finished" podID="a156c146-9589-4de6-8a2e-9c6740a94542" containerID="981feb0827f1e186ceb1a8923ef3f31bfaee1f23d42cb1711e983d3cd9761aca" exitCode=0 Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.714003 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.714340 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" event={"ID":"a156c146-9589-4de6-8a2e-9c6740a94542","Type":"ContainerDied","Data":"981feb0827f1e186ceb1a8923ef3f31bfaee1f23d42cb1711e983d3cd9761aca"} Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.714373 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7544fb748f-gl6nd" event={"ID":"a156c146-9589-4de6-8a2e-9c6740a94542","Type":"ContainerDied","Data":"20e836339cf2ff4d474dafc8c728187e0c0a5f51660378abf0837538eb6b0497"} Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.714387 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.714407 4835 scope.go:117] "RemoveContainer" containerID="981feb0827f1e186ceb1a8923ef3f31bfaee1f23d42cb1711e983d3cd9761aca" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.724205 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a156c146-9589-4de6-8a2e-9c6740a94542-kube-api-access-hd7pz" (OuterVolumeSpecName: "kube-api-access-hd7pz") pod "a156c146-9589-4de6-8a2e-9c6740a94542" (UID: "a156c146-9589-4de6-8a2e-9c6740a94542"). InnerVolumeSpecName "kube-api-access-hd7pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.775468 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a156c146-9589-4de6-8a2e-9c6740a94542" (UID: "a156c146-9589-4de6-8a2e-9c6740a94542"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.785512 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd7pz\" (UniqueName: \"kubernetes.io/projected/a156c146-9589-4de6-8a2e-9c6740a94542-kube-api-access-hd7pz\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.785542 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:17 crc kubenswrapper[4835]: W1003 18:33:17.790022 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode203cdb5_ef30_469a_bc4e_3bae2306043d.slice/crio-44b6ab078a4a1f621363e1dae21ec0b86b606c41fdc4e421967d3dad539f8e72 WatchSource:0}: Error finding container 44b6ab078a4a1f621363e1dae21ec0b86b606c41fdc4e421967d3dad539f8e72: Status 404 returned error can't find the container with id 44b6ab078a4a1f621363e1dae21ec0b86b606c41fdc4e421967d3dad539f8e72 Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.802508 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d779b7fdc-pqmkj"] Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.802824 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a156c146-9589-4de6-8a2e-9c6740a94542" (UID: "a156c146-9589-4de6-8a2e-9c6740a94542"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.812698 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a156c146-9589-4de6-8a2e-9c6740a94542" (UID: "a156c146-9589-4de6-8a2e-9c6740a94542"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.827379 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a156c146-9589-4de6-8a2e-9c6740a94542" (UID: "a156c146-9589-4de6-8a2e-9c6740a94542"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.844509 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-config" (OuterVolumeSpecName: "config") pod "a156c146-9589-4de6-8a2e-9c6740a94542" (UID: "a156c146-9589-4de6-8a2e-9c6740a94542"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.866590 4835 scope.go:117] "RemoveContainer" containerID="16b1f41c6593896b48d38893b0ddc28e44afc0acec69c59e9285d085c458c55e" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.887532 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.887759 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.887855 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.887912 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a156c146-9589-4de6-8a2e-9c6740a94542-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.898624 4835 scope.go:117] "RemoveContainer" containerID="981feb0827f1e186ceb1a8923ef3f31bfaee1f23d42cb1711e983d3cd9761aca" Oct 03 18:33:17 crc kubenswrapper[4835]: E1003 18:33:17.900051 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"981feb0827f1e186ceb1a8923ef3f31bfaee1f23d42cb1711e983d3cd9761aca\": container with ID starting with 981feb0827f1e186ceb1a8923ef3f31bfaee1f23d42cb1711e983d3cd9761aca not found: ID does not exist" containerID="981feb0827f1e186ceb1a8923ef3f31bfaee1f23d42cb1711e983d3cd9761aca" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.900129 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"981feb0827f1e186ceb1a8923ef3f31bfaee1f23d42cb1711e983d3cd9761aca"} err="failed to get container status 
\"981feb0827f1e186ceb1a8923ef3f31bfaee1f23d42cb1711e983d3cd9761aca\": rpc error: code = NotFound desc = could not find container \"981feb0827f1e186ceb1a8923ef3f31bfaee1f23d42cb1711e983d3cd9761aca\": container with ID starting with 981feb0827f1e186ceb1a8923ef3f31bfaee1f23d42cb1711e983d3cd9761aca not found: ID does not exist" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.900151 4835 scope.go:117] "RemoveContainer" containerID="16b1f41c6593896b48d38893b0ddc28e44afc0acec69c59e9285d085c458c55e" Oct 03 18:33:17 crc kubenswrapper[4835]: E1003 18:33:17.900403 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16b1f41c6593896b48d38893b0ddc28e44afc0acec69c59e9285d085c458c55e\": container with ID starting with 16b1f41c6593896b48d38893b0ddc28e44afc0acec69c59e9285d085c458c55e not found: ID does not exist" containerID="16b1f41c6593896b48d38893b0ddc28e44afc0acec69c59e9285d085c458c55e" Oct 03 18:33:17 crc kubenswrapper[4835]: I1003 18:33:17.900438 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16b1f41c6593896b48d38893b0ddc28e44afc0acec69c59e9285d085c458c55e"} err="failed to get container status \"16b1f41c6593896b48d38893b0ddc28e44afc0acec69c59e9285d085c458c55e\": rpc error: code = NotFound desc = could not find container \"16b1f41c6593896b48d38893b0ddc28e44afc0acec69c59e9285d085c458c55e\": container with ID starting with 16b1f41c6593896b48d38893b0ddc28e44afc0acec69c59e9285d085c458c55e not found: ID does not exist" Oct 03 18:33:18 crc kubenswrapper[4835]: I1003 18:33:18.047290 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7544fb748f-gl6nd"] Oct 03 18:33:18 crc kubenswrapper[4835]: I1003 18:33:18.056393 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7544fb748f-gl6nd"] Oct 03 18:33:18 crc kubenswrapper[4835]: I1003 18:33:18.166308 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 18:33:18 crc kubenswrapper[4835]: I1003 18:33:18.166630 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 18:33:18 crc kubenswrapper[4835]: I1003 18:33:18.167465 4835 scope.go:117] "RemoveContainer" containerID="93a43d0e5cc9cf9fc0d017857367c8f6de72ed7cb5c9b28e99b416df23f0b755" Oct 03 18:33:18 crc kubenswrapper[4835]: E1003 18:33:18.167736 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(8110d0e5-9e19-4306-b8aa-babe937e8d2a)\"" pod="openstack/watcher-decision-engine-0" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" Oct 03 18:33:18 crc kubenswrapper[4835]: W1003 18:33:18.172814 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd775c7bb_fbc5_42d0_9f09_9edfac2d88e3.slice/crio-2fa3eb268d80fa91270fa7fc799e9cef57a9ad0e20620a4b1ee4b0ad66594dbe WatchSource:0}: Error finding container 2fa3eb268d80fa91270fa7fc799e9cef57a9ad0e20620a4b1ee4b0ad66594dbe: Status 404 returned error can't find the container with id 2fa3eb268d80fa91270fa7fc799e9cef57a9ad0e20620a4b1ee4b0ad66594dbe Oct 03 18:33:18 crc kubenswrapper[4835]: I1003 18:33:18.173708 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-bc8b99c48-vnw8l"] Oct 03 18:33:18 crc kubenswrapper[4835]: I1003 18:33:18.723705 4835 generic.go:334] "Generic (PLEG): container finished" podID="e203cdb5-ef30-469a-bc4e-3bae2306043d" containerID="41068cf045e7c4a39ad4c034b2f988174ffbfe924d3cce62f6211a4283b6cfff" exitCode=0 Oct 03 18:33:18 crc kubenswrapper[4835]: I1003 18:33:18.724031 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" event={"ID":"e203cdb5-ef30-469a-bc4e-3bae2306043d","Type":"ContainerDied","Data":"41068cf045e7c4a39ad4c034b2f988174ffbfe924d3cce62f6211a4283b6cfff"} Oct 03 18:33:18 crc kubenswrapper[4835]: I1003 18:33:18.724058 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" event={"ID":"e203cdb5-ef30-469a-bc4e-3bae2306043d","Type":"ContainerStarted","Data":"44b6ab078a4a1f621363e1dae21ec0b86b606c41fdc4e421967d3dad539f8e72"} Oct 03 18:33:18 crc kubenswrapper[4835]: I1003 18:33:18.729123 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bc8b99c48-vnw8l" event={"ID":"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3","Type":"ContainerStarted","Data":"420f7d3d203ffe2fdbff1a6c97f48d176d2b897cfde5fc2523caa34a448cf47a"} Oct 03 18:33:18 crc kubenswrapper[4835]: I1003 18:33:18.729157 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bc8b99c48-vnw8l" event={"ID":"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3","Type":"ContainerStarted","Data":"e1e711ba78a8bc75bfb1e51e87b4144137d3d0a868e3ef228156096a8eb936b8"} Oct 03 18:33:18 crc kubenswrapper[4835]: I1003 18:33:18.729167 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bc8b99c48-vnw8l" event={"ID":"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3","Type":"ContainerStarted","Data":"2fa3eb268d80fa91270fa7fc799e9cef57a9ad0e20620a4b1ee4b0ad66594dbe"} Oct 03 18:33:18 crc kubenswrapper[4835]: I1003 18:33:18.729437 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:18 crc kubenswrapper[4835]: I1003 18:33:18.792790 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-bc8b99c48-vnw8l" podStartSLOduration=2.792775398 podStartE2EDuration="2.792775398s" podCreationTimestamp="2025-10-03 18:33:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:33:18.787703275 +0000 UTC m=+1140.503644147" watchObservedRunningTime="2025-10-03 18:33:18.792775398 +0000 UTC m=+1140.508716270" Oct 03 18:33:18 crc kubenswrapper[4835]: I1003 18:33:18.955490 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a156c146-9589-4de6-8a2e-9c6740a94542" path="/var/lib/kubelet/pods/a156c146-9589-4de6-8a2e-9c6740a94542/volumes" Oct 03 18:33:19 crc kubenswrapper[4835]: I1003 18:33:19.121417 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 18:33:19 crc kubenswrapper[4835]: I1003 18:33:19.196246 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 18:33:19 crc kubenswrapper[4835]: I1003 18:33:19.741464 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" event={"ID":"e203cdb5-ef30-469a-bc4e-3bae2306043d","Type":"ContainerStarted","Data":"c3f706c82d7f322ce42be521f77e1ac32da0134e3f8d4db9cd0f902102354180"} Oct 03 18:33:19 crc kubenswrapper[4835]: I1003 18:33:19.741631 4835 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ab985863-4ac9-44bc-977a-241abc2c4635" containerName="cinder-scheduler" containerID="cri-o://8496edcaf7deb698b14d7a65627efb629e03abae511efc15bc13a13f89f2df07" gracePeriod=30 Oct 03 18:33:19 crc kubenswrapper[4835]: I1003 18:33:19.741712 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:19 crc kubenswrapper[4835]: I1003 18:33:19.741748 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ab985863-4ac9-44bc-977a-241abc2c4635" containerName="probe" containerID="cri-o://f8c4f7746c74ef457dac60a74b3dae6438722672394bf99df6697234fecafa91" gracePeriod=30 Oct 03 18:33:19 crc kubenswrapper[4835]: I1003 18:33:19.767719 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" podStartSLOduration=3.767702913 podStartE2EDuration="3.767702913s" podCreationTimestamp="2025-10-03 18:33:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:33:19.761143234 +0000 UTC m=+1141.477084116" watchObservedRunningTime="2025-10-03 18:33:19.767702913 +0000 UTC m=+1141.483643785" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.109454 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-84d4b96669-666zm"] Oct 03 18:33:20 crc kubenswrapper[4835]: E1003 18:33:20.110127 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a156c146-9589-4de6-8a2e-9c6740a94542" containerName="init" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.110144 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a156c146-9589-4de6-8a2e-9c6740a94542" containerName="init" Oct 03 18:33:20 crc kubenswrapper[4835]: E1003 18:33:20.110160 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a156c146-9589-4de6-8a2e-9c6740a94542" containerName="dnsmasq-dns" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.110166 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a156c146-9589-4de6-8a2e-9c6740a94542" containerName="dnsmasq-dns" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.110350 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a156c146-9589-4de6-8a2e-9c6740a94542" containerName="dnsmasq-dns" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.111385 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.114157 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.114253 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.125197 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84d4b96669-666zm"] Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.244053 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-public-tls-certs\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.244259 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-config\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.244460 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-combined-ca-bundle\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.244526 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-httpd-config\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.244618 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-internal-tls-certs\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.244654 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-ovndb-tls-certs\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.244689 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkh8x\" (UniqueName: \"kubernetes.io/projected/c9649777-5191-4e89-a8b0-a164e4998af6-kube-api-access-xkh8x\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.347055 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-combined-ca-bundle\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.347328 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-httpd-config\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.347462 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-internal-tls-certs\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.347538 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-ovndb-tls-certs\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.347627 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkh8x\" (UniqueName: \"kubernetes.io/projected/c9649777-5191-4e89-a8b0-a164e4998af6-kube-api-access-xkh8x\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.347760 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-public-tls-certs\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.347858 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-config\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.353267 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-httpd-config\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.353802 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-internal-tls-certs\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.355825 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-config\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " 
pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.355836 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-public-tls-certs\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.355850 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-ovndb-tls-certs\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.356948 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9649777-5191-4e89-a8b0-a164e4998af6-combined-ca-bundle\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.380094 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkh8x\" (UniqueName: \"kubernetes.io/projected/c9649777-5191-4e89-a8b0-a164e4998af6-kube-api-access-xkh8x\") pod \"neutron-84d4b96669-666zm\" (UID: \"c9649777-5191-4e89-a8b0-a164e4998af6\") " pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.447356 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.743486 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64dcfd48b6-tpcpd" podUID="de5d465a-f009-4cef-940e-3b2aaa64468b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.165:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.165:8443: connect: connection refused" Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.756476 4835 generic.go:334] "Generic (PLEG): container finished" podID="ab985863-4ac9-44bc-977a-241abc2c4635" containerID="f8c4f7746c74ef457dac60a74b3dae6438722672394bf99df6697234fecafa91" exitCode=0 Oct 03 18:33:20 crc kubenswrapper[4835]: I1003 18:33:20.756564 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ab985863-4ac9-44bc-977a-241abc2c4635","Type":"ContainerDied","Data":"f8c4f7746c74ef457dac60a74b3dae6438722672394bf99df6697234fecafa91"} Oct 03 18:33:21 crc kubenswrapper[4835]: I1003 18:33:21.071634 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84d4b96669-666zm"] Oct 03 18:33:21 crc kubenswrapper[4835]: I1003 18:33:21.778671 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84d4b96669-666zm" event={"ID":"c9649777-5191-4e89-a8b0-a164e4998af6","Type":"ContainerStarted","Data":"91db45664d99ba2c5cc9be96b51632e3581d50a24e5f6988c8d88e481da4e0cb"} Oct 03 18:33:21 crc kubenswrapper[4835]: I1003 18:33:21.779252 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:21 crc kubenswrapper[4835]: I1003 18:33:21.779284 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84d4b96669-666zm" 
event={"ID":"c9649777-5191-4e89-a8b0-a164e4998af6","Type":"ContainerStarted","Data":"64ac1dcae543e8a75cc8a83a61f2fbeb67407a94a61ac2263691fbbbfa3f57fa"} Oct 03 18:33:21 crc kubenswrapper[4835]: I1003 18:33:21.779303 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84d4b96669-666zm" event={"ID":"c9649777-5191-4e89-a8b0-a164e4998af6","Type":"ContainerStarted","Data":"adc2712874695abe6f1262794f404995e655c7d4a18740d3f5cd64e38392c6f0"} Oct 03 18:33:21 crc kubenswrapper[4835]: I1003 18:33:21.806340 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-84d4b96669-666zm" podStartSLOduration=1.806320753 podStartE2EDuration="1.806320753s" podCreationTimestamp="2025-10-03 18:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:33:21.801619679 +0000 UTC m=+1143.517560551" watchObservedRunningTime="2025-10-03 18:33:21.806320753 +0000 UTC m=+1143.522261625" Oct 03 18:33:21 crc kubenswrapper[4835]: I1003 18:33:21.825208 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 03 18:33:21 crc kubenswrapper[4835]: I1003 18:33:21.831195 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 03 18:33:23 crc kubenswrapper[4835]: I1003 18:33:23.805555 4835 generic.go:334] "Generic (PLEG): container finished" podID="ab985863-4ac9-44bc-977a-241abc2c4635" containerID="8496edcaf7deb698b14d7a65627efb629e03abae511efc15bc13a13f89f2df07" exitCode=0 Oct 03 18:33:23 crc kubenswrapper[4835]: I1003 18:33:23.805671 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ab985863-4ac9-44bc-977a-241abc2c4635","Type":"ContainerDied","Data":"8496edcaf7deb698b14d7a65627efb629e03abae511efc15bc13a13f89f2df07"} Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.123979 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.219707 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab985863-4ac9-44bc-977a-241abc2c4635-etc-machine-id\") pod \"ab985863-4ac9-44bc-977a-241abc2c4635\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.219780 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-combined-ca-bundle\") pod \"ab985863-4ac9-44bc-977a-241abc2c4635\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.219817 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-scripts\") pod \"ab985863-4ac9-44bc-977a-241abc2c4635\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.219869 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg5s7\" (UniqueName: \"kubernetes.io/projected/ab985863-4ac9-44bc-977a-241abc2c4635-kube-api-access-sg5s7\") pod \"ab985863-4ac9-44bc-977a-241abc2c4635\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.219931 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-config-data-custom\") pod \"ab985863-4ac9-44bc-977a-241abc2c4635\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.220035 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-config-data\") pod \"ab985863-4ac9-44bc-977a-241abc2c4635\" (UID: \"ab985863-4ac9-44bc-977a-241abc2c4635\") " Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.220285 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab985863-4ac9-44bc-977a-241abc2c4635-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ab985863-4ac9-44bc-977a-241abc2c4635" (UID: "ab985863-4ac9-44bc-977a-241abc2c4635"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.221594 4835 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ab985863-4ac9-44bc-977a-241abc2c4635-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.226263 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab985863-4ac9-44bc-977a-241abc2c4635-kube-api-access-sg5s7" (OuterVolumeSpecName: "kube-api-access-sg5s7") pod "ab985863-4ac9-44bc-977a-241abc2c4635" (UID: "ab985863-4ac9-44bc-977a-241abc2c4635"). InnerVolumeSpecName "kube-api-access-sg5s7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.226261 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-scripts" (OuterVolumeSpecName: "scripts") pod "ab985863-4ac9-44bc-977a-241abc2c4635" (UID: "ab985863-4ac9-44bc-977a-241abc2c4635"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.226327 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ab985863-4ac9-44bc-977a-241abc2c4635" (UID: "ab985863-4ac9-44bc-977a-241abc2c4635"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.287830 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab985863-4ac9-44bc-977a-241abc2c4635" (UID: "ab985863-4ac9-44bc-977a-241abc2c4635"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.323484 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg5s7\" (UniqueName: \"kubernetes.io/projected/ab985863-4ac9-44bc-977a-241abc2c4635-kube-api-access-sg5s7\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.323519 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.323528 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.323540 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.355956 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-config-data" (OuterVolumeSpecName: "config-data") pod "ab985863-4ac9-44bc-977a-241abc2c4635" (UID: "ab985863-4ac9-44bc-977a-241abc2c4635"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.424868 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab985863-4ac9-44bc-977a-241abc2c4635-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.737506 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.738942 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-799c89c95d-bzssk" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.837560 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ab985863-4ac9-44bc-977a-241abc2c4635","Type":"ContainerDied","Data":"3c40559d361821195be188df0e526a4ef8b23401f622d98ae4729d1314bd6c97"} Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.838811 4835 scope.go:117] "RemoveContainer" containerID="f8c4f7746c74ef457dac60a74b3dae6438722672394bf99df6697234fecafa91" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.839053 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.865744 4835 scope.go:117] "RemoveContainer" containerID="8496edcaf7deb698b14d7a65627efb629e03abae511efc15bc13a13f89f2df07" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.897247 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.909209 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.929022 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 18:33:24 crc kubenswrapper[4835]: E1003 18:33:24.929453 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab985863-4ac9-44bc-977a-241abc2c4635" containerName="probe" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.929464 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab985863-4ac9-44bc-977a-241abc2c4635" containerName="probe" Oct 03 18:33:24 crc kubenswrapper[4835]: E1003 18:33:24.929476 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab985863-4ac9-44bc-977a-241abc2c4635" containerName="cinder-scheduler" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.929481 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab985863-4ac9-44bc-977a-241abc2c4635" containerName="cinder-scheduler" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.929654 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab985863-4ac9-44bc-977a-241abc2c4635" containerName="probe" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.929685 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab985863-4ac9-44bc-977a-241abc2c4635" containerName="cinder-scheduler" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.938703 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.946327 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 18:33:24 crc kubenswrapper[4835]: I1003 18:33:24.958655 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.050154 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-config-data\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.050439 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.050552 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.050643 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.050755 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfbx7\" (UniqueName: \"kubernetes.io/projected/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-kube-api-access-kfbx7\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.050846 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-scripts\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.151651 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-config-data\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.151955 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.152116 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.152219 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.152318 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfbx7\" (UniqueName: \"kubernetes.io/projected/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-kube-api-access-kfbx7\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.152395 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-scripts\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.153503 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.170929 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.171332 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-scripts\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.171568 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.172259 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-config-data\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.174739 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfbx7\" (UniqueName: \"kubernetes.io/projected/b57fa1fd-eb1f-4c40-a153-4e6f48698ab8-kube-api-access-kfbx7\") pod \"cinder-scheduler-0\" (UID: \"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8\") " pod="openstack/cinder-scheduler-0" 
Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.276378 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.759432 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 18:33:25 crc kubenswrapper[4835]: I1003 18:33:25.851194 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8","Type":"ContainerStarted","Data":"330879d3d23908ea5d2ea81720a099544199fe3ef759e328f10ee5c38d410ec9"} Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.046299 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5c4db54587-knmn7" Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.778612 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.780180 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.787780 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.788403 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-9bxjt" Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.788550 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.788573 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.867436 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8","Type":"ContainerStarted","Data":"67c3352385b0449ea9e48e6e07cf22a2733e0a7a2348058f73258fb3934b0d73"} Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.894157 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d72a4931-5c9f-4e46-a8b1-5f0f07b4c643-openstack-config-secret\") pod \"openstackclient\" (UID: \"d72a4931-5c9f-4e46-a8b1-5f0f07b4c643\") " pod="openstack/openstackclient" Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.894200 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d72a4931-5c9f-4e46-a8b1-5f0f07b4c643-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d72a4931-5c9f-4e46-a8b1-5f0f07b4c643\") " pod="openstack/openstackclient" Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.894238 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d72a4931-5c9f-4e46-a8b1-5f0f07b4c643-openstack-config\") pod \"openstackclient\" (UID: \"d72a4931-5c9f-4e46-a8b1-5f0f07b4c643\") " pod="openstack/openstackclient" Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.894916 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqq2l\" (UniqueName: 
\"kubernetes.io/projected/d72a4931-5c9f-4e46-a8b1-5f0f07b4c643-kube-api-access-vqq2l\") pod \"openstackclient\" (UID: \"d72a4931-5c9f-4e46-a8b1-5f0f07b4c643\") " pod="openstack/openstackclient" Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.902701 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab985863-4ac9-44bc-977a-241abc2c4635" path="/var/lib/kubelet/pods/ab985863-4ac9-44bc-977a-241abc2c4635/volumes" Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.997060 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqq2l\" (UniqueName: \"kubernetes.io/projected/d72a4931-5c9f-4e46-a8b1-5f0f07b4c643-kube-api-access-vqq2l\") pod \"openstackclient\" (UID: \"d72a4931-5c9f-4e46-a8b1-5f0f07b4c643\") " pod="openstack/openstackclient" Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.997134 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d72a4931-5c9f-4e46-a8b1-5f0f07b4c643-openstack-config-secret\") pod \"openstackclient\" (UID: \"d72a4931-5c9f-4e46-a8b1-5f0f07b4c643\") " pod="openstack/openstackclient" Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.997159 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d72a4931-5c9f-4e46-a8b1-5f0f07b4c643-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d72a4931-5c9f-4e46-a8b1-5f0f07b4c643\") " pod="openstack/openstackclient" Oct 03 18:33:26 crc kubenswrapper[4835]: I1003 18:33:26.997185 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d72a4931-5c9f-4e46-a8b1-5f0f07b4c643-openstack-config\") pod \"openstackclient\" (UID: \"d72a4931-5c9f-4e46-a8b1-5f0f07b4c643\") " pod="openstack/openstackclient" Oct 03 18:33:27 crc kubenswrapper[4835]: I1003 18:33:27.005610 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d72a4931-5c9f-4e46-a8b1-5f0f07b4c643-openstack-config\") pod \"openstackclient\" (UID: \"d72a4931-5c9f-4e46-a8b1-5f0f07b4c643\") " pod="openstack/openstackclient" Oct 03 18:33:27 crc kubenswrapper[4835]: I1003 18:33:27.007593 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d72a4931-5c9f-4e46-a8b1-5f0f07b4c643-openstack-config-secret\") pod \"openstackclient\" (UID: \"d72a4931-5c9f-4e46-a8b1-5f0f07b4c643\") " pod="openstack/openstackclient" Oct 03 18:33:27 crc kubenswrapper[4835]: I1003 18:33:27.020581 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqq2l\" (UniqueName: \"kubernetes.io/projected/d72a4931-5c9f-4e46-a8b1-5f0f07b4c643-kube-api-access-vqq2l\") pod \"openstackclient\" (UID: \"d72a4931-5c9f-4e46-a8b1-5f0f07b4c643\") " pod="openstack/openstackclient" Oct 03 18:33:27 crc kubenswrapper[4835]: I1003 18:33:27.023305 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d72a4931-5c9f-4e46-a8b1-5f0f07b4c643-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d72a4931-5c9f-4e46-a8b1-5f0f07b4c643\") " pod="openstack/openstackclient" Oct 03 18:33:27 crc kubenswrapper[4835]: I1003 18:33:27.034706 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/cinder-api-0" Oct 03 18:33:27 crc kubenswrapper[4835]: I1003 18:33:27.111566 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 18:33:27 crc kubenswrapper[4835]: I1003 18:33:27.331312 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:33:27 crc kubenswrapper[4835]: I1003 18:33:27.389589 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d487cf869-fnqtk"] Oct 03 18:33:27 crc kubenswrapper[4835]: I1003 18:33:27.389807 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" podUID="d74b6af7-4f28-432b-b2bb-b2a6b4bccf01" containerName="dnsmasq-dns" containerID="cri-o://bf152e40061f253da96c8f28442adf39bf245f9ead88ea848009341bbc55e28f" gracePeriod=10 Oct 03 18:33:27 crc kubenswrapper[4835]: I1003 18:33:27.702156 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 18:33:27 crc kubenswrapper[4835]: I1003 18:33:27.904494 4835 generic.go:334] "Generic (PLEG): container finished" podID="d74b6af7-4f28-432b-b2bb-b2a6b4bccf01" containerID="bf152e40061f253da96c8f28442adf39bf245f9ead88ea848009341bbc55e28f" exitCode=0 Oct 03 18:33:27 crc kubenswrapper[4835]: I1003 18:33:27.904691 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" event={"ID":"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01","Type":"ContainerDied","Data":"bf152e40061f253da96c8f28442adf39bf245f9ead88ea848009341bbc55e28f"} Oct 03 18:33:27 crc kubenswrapper[4835]: I1003 18:33:27.915707 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b57fa1fd-eb1f-4c40-a153-4e6f48698ab8","Type":"ContainerStarted","Data":"ad97bb935772edd30e2642cd1e7251281717b17e645effaf32d6801580afbff0"} Oct 03 18:33:27 crc kubenswrapper[4835]: I1003 18:33:27.918984 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d72a4931-5c9f-4e46-a8b1-5f0f07b4c643","Type":"ContainerStarted","Data":"9fa09181c6706d7955aa50afbd8d2c80e23076288680ada67f4d10ed68b482e7"} Oct 03 18:33:27 crc kubenswrapper[4835]: I1003 18:33:27.960805 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.960787571 podStartE2EDuration="3.960787571s" podCreationTimestamp="2025-10-03 18:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:33:27.951392483 +0000 UTC m=+1149.667333355" watchObservedRunningTime="2025-10-03 18:33:27.960787571 +0000 UTC m=+1149.676728443" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.057380 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.152675 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-dns-svc\") pod \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.152832 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zprm9\" (UniqueName: \"kubernetes.io/projected/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-kube-api-access-zprm9\") pod \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.152872 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-config\") pod \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.152909 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-ovsdbserver-nb\") pod \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.152957 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-ovsdbserver-sb\") pod \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.153048 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-dns-swift-storage-0\") pod \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\" (UID: \"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01\") " Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.164339 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-kube-api-access-zprm9" (OuterVolumeSpecName: "kube-api-access-zprm9") pod "d74b6af7-4f28-432b-b2bb-b2a6b4bccf01" (UID: "d74b6af7-4f28-432b-b2bb-b2a6b4bccf01"). InnerVolumeSpecName "kube-api-access-zprm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.209263 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d74b6af7-4f28-432b-b2bb-b2a6b4bccf01" (UID: "d74b6af7-4f28-432b-b2bb-b2a6b4bccf01"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.231423 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-config" (OuterVolumeSpecName: "config") pod "d74b6af7-4f28-432b-b2bb-b2a6b4bccf01" (UID: "d74b6af7-4f28-432b-b2bb-b2a6b4bccf01"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.232728 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d74b6af7-4f28-432b-b2bb-b2a6b4bccf01" (UID: "d74b6af7-4f28-432b-b2bb-b2a6b4bccf01"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.241473 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d74b6af7-4f28-432b-b2bb-b2a6b4bccf01" (UID: "d74b6af7-4f28-432b-b2bb-b2a6b4bccf01"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.250805 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d74b6af7-4f28-432b-b2bb-b2a6b4bccf01" (UID: "d74b6af7-4f28-432b-b2bb-b2a6b4bccf01"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.255320 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.255350 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.255359 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zprm9\" (UniqueName: \"kubernetes.io/projected/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-kube-api-access-zprm9\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.255369 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.255378 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.255386 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.934923 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.935835 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d487cf869-fnqtk" event={"ID":"d74b6af7-4f28-432b-b2bb-b2a6b4bccf01","Type":"ContainerDied","Data":"7c90bb4998a05feeeaaf7f7281c787361109edd31fa69728b7dd874cc299c107"} Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.935908 4835 scope.go:117] "RemoveContainer" containerID="bf152e40061f253da96c8f28442adf39bf245f9ead88ea848009341bbc55e28f" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.985167 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d487cf869-fnqtk"] Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.989206 4835 scope.go:117] "RemoveContainer" containerID="25d2683dbee2ecf278f809adb920be370f8f8d45740841d4468b8bbcb695192a" Oct 03 18:33:28 crc kubenswrapper[4835]: I1003 18:33:28.999719 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d487cf869-fnqtk"] Oct 03 18:33:30 crc kubenswrapper[4835]: I1003 18:33:30.276843 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 18:33:30 crc kubenswrapper[4835]: I1003 18:33:30.743629 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64dcfd48b6-tpcpd" podUID="de5d465a-f009-4cef-940e-3b2aaa64468b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.165:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.165:8443: connect: connection refused" Oct 03 18:33:30 crc kubenswrapper[4835]: I1003 18:33:30.743748 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:33:30 crc kubenswrapper[4835]: I1003 18:33:30.892270 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d74b6af7-4f28-432b-b2bb-b2a6b4bccf01" path="/var/lib/kubelet/pods/d74b6af7-4f28-432b-b2bb-b2a6b4bccf01/volumes" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.296272 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-75bfc94c9f-srgmb"] Oct 03 18:33:32 crc kubenswrapper[4835]: E1003 18:33:32.296995 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d74b6af7-4f28-432b-b2bb-b2a6b4bccf01" containerName="dnsmasq-dns" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.297007 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d74b6af7-4f28-432b-b2bb-b2a6b4bccf01" containerName="dnsmasq-dns" Oct 03 18:33:32 crc kubenswrapper[4835]: E1003 18:33:32.297044 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d74b6af7-4f28-432b-b2bb-b2a6b4bccf01" containerName="init" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.297051 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d74b6af7-4f28-432b-b2bb-b2a6b4bccf01" containerName="init" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.297246 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d74b6af7-4f28-432b-b2bb-b2a6b4bccf01" containerName="dnsmasq-dns" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.298373 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.301051 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.305465 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.306406 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.318643 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-75bfc94c9f-srgmb"] Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.351146 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4qnl\" (UniqueName: \"kubernetes.io/projected/f75702cb-25b4-45f5-a26f-0867f10dc525-kube-api-access-x4qnl\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.351203 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f75702cb-25b4-45f5-a26f-0867f10dc525-etc-swift\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.351224 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f75702cb-25b4-45f5-a26f-0867f10dc525-internal-tls-certs\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.351302 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f75702cb-25b4-45f5-a26f-0867f10dc525-log-httpd\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.351321 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f75702cb-25b4-45f5-a26f-0867f10dc525-public-tls-certs\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.351337 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f75702cb-25b4-45f5-a26f-0867f10dc525-combined-ca-bundle\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.351384 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f75702cb-25b4-45f5-a26f-0867f10dc525-run-httpd\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " 
pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.351419 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f75702cb-25b4-45f5-a26f-0867f10dc525-config-data\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.453507 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f75702cb-25b4-45f5-a26f-0867f10dc525-etc-swift\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.453546 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f75702cb-25b4-45f5-a26f-0867f10dc525-internal-tls-certs\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.453621 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f75702cb-25b4-45f5-a26f-0867f10dc525-log-httpd\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.453640 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f75702cb-25b4-45f5-a26f-0867f10dc525-public-tls-certs\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.453662 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f75702cb-25b4-45f5-a26f-0867f10dc525-combined-ca-bundle\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.453707 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f75702cb-25b4-45f5-a26f-0867f10dc525-run-httpd\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.453743 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f75702cb-25b4-45f5-a26f-0867f10dc525-config-data\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.453782 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4qnl\" (UniqueName: \"kubernetes.io/projected/f75702cb-25b4-45f5-a26f-0867f10dc525-kube-api-access-x4qnl\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 
18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.454665 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f75702cb-25b4-45f5-a26f-0867f10dc525-log-httpd\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.454703 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f75702cb-25b4-45f5-a26f-0867f10dc525-run-httpd\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.460809 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f75702cb-25b4-45f5-a26f-0867f10dc525-combined-ca-bundle\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.461016 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f75702cb-25b4-45f5-a26f-0867f10dc525-public-tls-certs\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.461024 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f75702cb-25b4-45f5-a26f-0867f10dc525-etc-swift\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.461610 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f75702cb-25b4-45f5-a26f-0867f10dc525-internal-tls-certs\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.464533 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f75702cb-25b4-45f5-a26f-0867f10dc525-config-data\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.478403 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4qnl\" (UniqueName: \"kubernetes.io/projected/f75702cb-25b4-45f5-a26f-0867f10dc525-kube-api-access-x4qnl\") pod \"swift-proxy-75bfc94c9f-srgmb\" (UID: \"f75702cb-25b4-45f5-a26f-0867f10dc525\") " pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.601758 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.602110 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c7762654-21e9-4999-b701-7498146153b2" containerName="ceilometer-central-agent" containerID="cri-o://f47b2382957676c586d20de83df106f0864dd9fcc944854a455b9bf25e1b8160" gracePeriod=30 Oct 03 18:33:32 crc 
kubenswrapper[4835]: I1003 18:33:32.602383 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c7762654-21e9-4999-b701-7498146153b2" containerName="proxy-httpd" containerID="cri-o://8a4226ecb007fd3ecbc03cb2ea96919d03f62135876d60e5152d7cfa3d1763e3" gracePeriod=30 Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.602451 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c7762654-21e9-4999-b701-7498146153b2" containerName="sg-core" containerID="cri-o://0f6d996612c55cd95822a881c0a094bac2d7d280e2f7fdfbf163ace4f5e5fa01" gracePeriod=30 Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.602494 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c7762654-21e9-4999-b701-7498146153b2" containerName="ceilometer-notification-agent" containerID="cri-o://34da65470a92bff1873d6194522087c396885e0763a63c540d1699999c59c7d5" gracePeriod=30 Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.618760 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.624686 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c7762654-21e9-4999-b701-7498146153b2" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.182:3000/\": EOF" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.877190 4835 scope.go:117] "RemoveContainer" containerID="93a43d0e5cc9cf9fc0d017857367c8f6de72ed7cb5c9b28e99b416df23f0b755" Oct 03 18:33:32 crc kubenswrapper[4835]: E1003 18:33:32.877702 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(8110d0e5-9e19-4306-b8aa-babe937e8d2a)\"" pod="openstack/watcher-decision-engine-0" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.987952 4835 generic.go:334] "Generic (PLEG): container finished" podID="c7762654-21e9-4999-b701-7498146153b2" containerID="8a4226ecb007fd3ecbc03cb2ea96919d03f62135876d60e5152d7cfa3d1763e3" exitCode=0 Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.987986 4835 generic.go:334] "Generic (PLEG): container finished" podID="c7762654-21e9-4999-b701-7498146153b2" containerID="0f6d996612c55cd95822a881c0a094bac2d7d280e2f7fdfbf163ace4f5e5fa01" exitCode=2 Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.988006 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7762654-21e9-4999-b701-7498146153b2","Type":"ContainerDied","Data":"8a4226ecb007fd3ecbc03cb2ea96919d03f62135876d60e5152d7cfa3d1763e3"} Oct 03 18:33:32 crc kubenswrapper[4835]: I1003 18:33:32.988032 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7762654-21e9-4999-b701-7498146153b2","Type":"ContainerDied","Data":"0f6d996612c55cd95822a881c0a094bac2d7d280e2f7fdfbf163ace4f5e5fa01"} Oct 03 18:33:34 crc kubenswrapper[4835]: I1003 18:33:34.009757 4835 generic.go:334] "Generic (PLEG): container finished" podID="c7762654-21e9-4999-b701-7498146153b2" containerID="34da65470a92bff1873d6194522087c396885e0763a63c540d1699999c59c7d5" exitCode=0 Oct 03 18:33:34 crc kubenswrapper[4835]: I1003 18:33:34.009815 4835 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7762654-21e9-4999-b701-7498146153b2","Type":"ContainerDied","Data":"34da65470a92bff1873d6194522087c396885e0763a63c540d1699999c59c7d5"} Oct 03 18:33:34 crc kubenswrapper[4835]: I1003 18:33:34.009851 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7762654-21e9-4999-b701-7498146153b2","Type":"ContainerDied","Data":"f47b2382957676c586d20de83df106f0864dd9fcc944854a455b9bf25e1b8160"} Oct 03 18:33:34 crc kubenswrapper[4835]: I1003 18:33:34.009790 4835 generic.go:334] "Generic (PLEG): container finished" podID="c7762654-21e9-4999-b701-7498146153b2" containerID="f47b2382957676c586d20de83df106f0864dd9fcc944854a455b9bf25e1b8160" exitCode=0 Oct 03 18:33:35 crc kubenswrapper[4835]: I1003 18:33:35.361000 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:33:35 crc kubenswrapper[4835]: I1003 18:33:35.361322 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:33:35 crc kubenswrapper[4835]: I1003 18:33:35.361363 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:33:35 crc kubenswrapper[4835]: I1003 18:33:35.362049 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5001daf2345cdee7613b40d98138459acb007dbdcb955c73ad790b203e897d4f"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 18:33:35 crc kubenswrapper[4835]: I1003 18:33:35.362117 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://5001daf2345cdee7613b40d98138459acb007dbdcb955c73ad790b203e897d4f" gracePeriod=600 Oct 03 18:33:35 crc kubenswrapper[4835]: E1003 18:33:35.441928 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10a8b8e7_c0f5_4c40_b0bd_b52379adae1f.slice/crio-conmon-5001daf2345cdee7613b40d98138459acb007dbdcb955c73ad790b203e897d4f.scope\": RecentStats: unable to find data in memory cache]" Oct 03 18:33:35 crc kubenswrapper[4835]: I1003 18:33:35.481186 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.030844 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="5001daf2345cdee7613b40d98138459acb007dbdcb955c73ad790b203e897d4f" exitCode=0 Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.031062 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"5001daf2345cdee7613b40d98138459acb007dbdcb955c73ad790b203e897d4f"} Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.031204 4835 scope.go:117] "RemoveContainer" containerID="fc96018384aa8860a4c2fcec8a03cef5fa41451e8751027f47a38b13cdf1722b" Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.034406 4835 generic.go:334] "Generic (PLEG): container finished" podID="de5d465a-f009-4cef-940e-3b2aaa64468b" containerID="c98cca13a795e8c49418d2713ce1f52d7c5add2a8a45469c7d0d5ca2a4207bec" exitCode=137 Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.034435 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64dcfd48b6-tpcpd" event={"ID":"de5d465a-f009-4cef-940e-3b2aaa64468b","Type":"ContainerDied","Data":"c98cca13a795e8c49418d2713ce1f52d7c5add2a8a45469c7d0d5ca2a4207bec"} Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.334124 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xw7fm"] Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.340776 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xw7fm" Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.346230 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xw7fm"] Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.528714 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkfd4\" (UniqueName: \"kubernetes.io/projected/d39d05ac-72e8-4449-8b55-1b4126c64554-kube-api-access-xkfd4\") pod \"nova-api-db-create-xw7fm\" (UID: \"d39d05ac-72e8-4449-8b55-1b4126c64554\") " pod="openstack/nova-api-db-create-xw7fm" Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.546928 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-cw6qt"] Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.548121 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-cw6qt" Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.561375 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-cw6qt"] Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.631777 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clwhs\" (UniqueName: \"kubernetes.io/projected/fc2b577b-2a9c-4651-95f2-ad815b073b61-kube-api-access-clwhs\") pod \"nova-cell0-db-create-cw6qt\" (UID: \"fc2b577b-2a9c-4651-95f2-ad815b073b61\") " pod="openstack/nova-cell0-db-create-cw6qt" Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.631865 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkfd4\" (UniqueName: \"kubernetes.io/projected/d39d05ac-72e8-4449-8b55-1b4126c64554-kube-api-access-xkfd4\") pod \"nova-api-db-create-xw7fm\" (UID: \"d39d05ac-72e8-4449-8b55-1b4126c64554\") " pod="openstack/nova-api-db-create-xw7fm" Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.631778 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-k2fcn"] Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.633271 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-k2fcn" Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.642509 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-k2fcn"] Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.672838 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkfd4\" (UniqueName: \"kubernetes.io/projected/d39d05ac-72e8-4449-8b55-1b4126c64554-kube-api-access-xkfd4\") pod \"nova-api-db-create-xw7fm\" (UID: \"d39d05ac-72e8-4449-8b55-1b4126c64554\") " pod="openstack/nova-api-db-create-xw7fm" Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.733852 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clwhs\" (UniqueName: \"kubernetes.io/projected/fc2b577b-2a9c-4651-95f2-ad815b073b61-kube-api-access-clwhs\") pod \"nova-cell0-db-create-cw6qt\" (UID: \"fc2b577b-2a9c-4651-95f2-ad815b073b61\") " pod="openstack/nova-cell0-db-create-cw6qt" Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.734247 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdmg4\" (UniqueName: \"kubernetes.io/projected/31632d24-3d3f-438c-a447-1a38f58ac87b-kube-api-access-wdmg4\") pod \"nova-cell1-db-create-k2fcn\" (UID: \"31632d24-3d3f-438c-a447-1a38f58ac87b\") " pod="openstack/nova-cell1-db-create-k2fcn" Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.755330 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clwhs\" (UniqueName: \"kubernetes.io/projected/fc2b577b-2a9c-4651-95f2-ad815b073b61-kube-api-access-clwhs\") pod \"nova-cell0-db-create-cw6qt\" (UID: \"fc2b577b-2a9c-4651-95f2-ad815b073b61\") " pod="openstack/nova-cell0-db-create-cw6qt" Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.835640 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdmg4\" (UniqueName: \"kubernetes.io/projected/31632d24-3d3f-438c-a447-1a38f58ac87b-kube-api-access-wdmg4\") pod \"nova-cell1-db-create-k2fcn\" (UID: \"31632d24-3d3f-438c-a447-1a38f58ac87b\") " pod="openstack/nova-cell1-db-create-k2fcn" Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.856280 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdmg4\" (UniqueName: \"kubernetes.io/projected/31632d24-3d3f-438c-a447-1a38f58ac87b-kube-api-access-wdmg4\") pod \"nova-cell1-db-create-k2fcn\" (UID: \"31632d24-3d3f-438c-a447-1a38f58ac87b\") " pod="openstack/nova-cell1-db-create-k2fcn" Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.879081 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-cw6qt" Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.958558 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k2fcn" Oct 03 18:33:36 crc kubenswrapper[4835]: I1003 18:33:36.959485 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xw7fm" Oct 03 18:33:38 crc kubenswrapper[4835]: I1003 18:33:38.166547 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 18:33:38 crc kubenswrapper[4835]: I1003 18:33:38.166924 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 03 18:33:38 crc kubenswrapper[4835]: I1003 18:33:38.167483 4835 scope.go:117] "RemoveContainer" containerID="93a43d0e5cc9cf9fc0d017857367c8f6de72ed7cb5c9b28e99b416df23f0b755" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.088880 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8110d0e5-9e19-4306-b8aa-babe937e8d2a","Type":"ContainerStarted","Data":"97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c"} Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.219429 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.346313 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.387828 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-config-data\") pod \"c7762654-21e9-4999-b701-7498146153b2\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.388112 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-combined-ca-bundle\") pod \"c7762654-21e9-4999-b701-7498146153b2\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.388352 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-scripts\") pod \"c7762654-21e9-4999-b701-7498146153b2\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.388419 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7762654-21e9-4999-b701-7498146153b2-run-httpd\") pod \"c7762654-21e9-4999-b701-7498146153b2\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.388466 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdjmn\" (UniqueName: \"kubernetes.io/projected/c7762654-21e9-4999-b701-7498146153b2-kube-api-access-bdjmn\") pod \"c7762654-21e9-4999-b701-7498146153b2\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.388494 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-sg-core-conf-yaml\") pod \"c7762654-21e9-4999-b701-7498146153b2\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.388532 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/c7762654-21e9-4999-b701-7498146153b2-log-httpd\") pod \"c7762654-21e9-4999-b701-7498146153b2\" (UID: \"c7762654-21e9-4999-b701-7498146153b2\") " Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.389928 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7762654-21e9-4999-b701-7498146153b2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c7762654-21e9-4999-b701-7498146153b2" (UID: "c7762654-21e9-4999-b701-7498146153b2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.390390 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7762654-21e9-4999-b701-7498146153b2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c7762654-21e9-4999-b701-7498146153b2" (UID: "c7762654-21e9-4999-b701-7498146153b2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.395903 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7762654-21e9-4999-b701-7498146153b2-kube-api-access-bdjmn" (OuterVolumeSpecName: "kube-api-access-bdjmn") pod "c7762654-21e9-4999-b701-7498146153b2" (UID: "c7762654-21e9-4999-b701-7498146153b2"). InnerVolumeSpecName "kube-api-access-bdjmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.396626 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-scripts" (OuterVolumeSpecName: "scripts") pod "c7762654-21e9-4999-b701-7498146153b2" (UID: "c7762654-21e9-4999-b701-7498146153b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.454187 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c7762654-21e9-4999-b701-7498146153b2" (UID: "c7762654-21e9-4999-b701-7498146153b2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.489947 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de5d465a-f009-4cef-940e-3b2aaa64468b-config-data\") pod \"de5d465a-f009-4cef-940e-3b2aaa64468b\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.490254 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de5d465a-f009-4cef-940e-3b2aaa64468b-scripts\") pod \"de5d465a-f009-4cef-940e-3b2aaa64468b\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.490419 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-combined-ca-bundle\") pod \"de5d465a-f009-4cef-940e-3b2aaa64468b\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.490523 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twmrd\" (UniqueName: \"kubernetes.io/projected/de5d465a-f009-4cef-940e-3b2aaa64468b-kube-api-access-twmrd\") pod \"de5d465a-f009-4cef-940e-3b2aaa64468b\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.490747 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-horizon-secret-key\") pod \"de5d465a-f009-4cef-940e-3b2aaa64468b\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.490841 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de5d465a-f009-4cef-940e-3b2aaa64468b-logs\") pod \"de5d465a-f009-4cef-940e-3b2aaa64468b\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.490980 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-horizon-tls-certs\") pod \"de5d465a-f009-4cef-940e-3b2aaa64468b\" (UID: \"de5d465a-f009-4cef-940e-3b2aaa64468b\") " Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.491438 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.491559 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7762654-21e9-4999-b701-7498146153b2-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.491621 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdjmn\" (UniqueName: \"kubernetes.io/projected/c7762654-21e9-4999-b701-7498146153b2-kube-api-access-bdjmn\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.491675 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-sg-core-conf-yaml\") on node 
\"crc\" DevicePath \"\"" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.491742 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c7762654-21e9-4999-b701-7498146153b2-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.498217 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7762654-21e9-4999-b701-7498146153b2" (UID: "c7762654-21e9-4999-b701-7498146153b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.506760 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de5d465a-f009-4cef-940e-3b2aaa64468b-logs" (OuterVolumeSpecName: "logs") pod "de5d465a-f009-4cef-940e-3b2aaa64468b" (UID: "de5d465a-f009-4cef-940e-3b2aaa64468b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.509864 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de5d465a-f009-4cef-940e-3b2aaa64468b-kube-api-access-twmrd" (OuterVolumeSpecName: "kube-api-access-twmrd") pod "de5d465a-f009-4cef-940e-3b2aaa64468b" (UID: "de5d465a-f009-4cef-940e-3b2aaa64468b"). InnerVolumeSpecName "kube-api-access-twmrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.509997 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "de5d465a-f009-4cef-940e-3b2aaa64468b" (UID: "de5d465a-f009-4cef-940e-3b2aaa64468b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.540154 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de5d465a-f009-4cef-940e-3b2aaa64468b" (UID: "de5d465a-f009-4cef-940e-3b2aaa64468b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.549846 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-config-data" (OuterVolumeSpecName: "config-data") pod "c7762654-21e9-4999-b701-7498146153b2" (UID: "c7762654-21e9-4999-b701-7498146153b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.559642 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "de5d465a-f009-4cef-940e-3b2aaa64468b" (UID: "de5d465a-f009-4cef-940e-3b2aaa64468b"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.566142 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de5d465a-f009-4cef-940e-3b2aaa64468b-config-data" (OuterVolumeSpecName: "config-data") pod "de5d465a-f009-4cef-940e-3b2aaa64468b" (UID: "de5d465a-f009-4cef-940e-3b2aaa64468b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.568266 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de5d465a-f009-4cef-940e-3b2aaa64468b-scripts" (OuterVolumeSpecName: "scripts") pod "de5d465a-f009-4cef-940e-3b2aaa64468b" (UID: "de5d465a-f009-4cef-940e-3b2aaa64468b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.593751 4835 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.593778 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/de5d465a-f009-4cef-940e-3b2aaa64468b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.593788 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de5d465a-f009-4cef-940e-3b2aaa64468b-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.593797 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.593805 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.593813 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twmrd\" (UniqueName: \"kubernetes.io/projected/de5d465a-f009-4cef-940e-3b2aaa64468b-kube-api-access-twmrd\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.593823 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7762654-21e9-4999-b701-7498146153b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.593832 4835 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/de5d465a-f009-4cef-940e-3b2aaa64468b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.593841 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de5d465a-f009-4cef-940e-3b2aaa64468b-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.659348 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xw7fm"] Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.729268 4835 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-cw6qt"] Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.750822 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-k2fcn"] Oct 03 18:33:39 crc kubenswrapper[4835]: W1003 18:33:39.802275 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31632d24_3d3f_438c_a447_1a38f58ac87b.slice/crio-0e23fddb5e9ffc4818477e7620aeb18eed64dacd0739de82c4d7cccba705d777 WatchSource:0}: Error finding container 0e23fddb5e9ffc4818477e7620aeb18eed64dacd0739de82c4d7cccba705d777: Status 404 returned error can't find the container with id 0e23fddb5e9ffc4818477e7620aeb18eed64dacd0739de82c4d7cccba705d777 Oct 03 18:33:39 crc kubenswrapper[4835]: I1003 18:33:39.902290 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-75bfc94c9f-srgmb"] Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.135532 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c7762654-21e9-4999-b701-7498146153b2","Type":"ContainerDied","Data":"85d82407301e02fdd5f71767b6b4a24c753806dd73ab2bf21a836e46f2cb1c9e"} Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.135865 4835 scope.go:117] "RemoveContainer" containerID="8a4226ecb007fd3ecbc03cb2ea96919d03f62135876d60e5152d7cfa3d1763e3" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.135777 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.155575 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75bfc94c9f-srgmb" event={"ID":"f75702cb-25b4-45f5-a26f-0867f10dc525","Type":"ContainerStarted","Data":"dc3a29e2b488be4c1c48d794829ee52ce3f578d536a8ec4dbbd9dc4a17b8d656"} Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.157476 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64dcfd48b6-tpcpd" event={"ID":"de5d465a-f009-4cef-940e-3b2aaa64468b","Type":"ContainerDied","Data":"56c851691059668dc7e722bad63f58ebc407d51feb1ac1b597a2a0e0b6897174"} Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.157594 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-64dcfd48b6-tpcpd" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.163012 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k2fcn" event={"ID":"31632d24-3d3f-438c-a447-1a38f58ac87b","Type":"ContainerStarted","Data":"0e23fddb5e9ffc4818477e7620aeb18eed64dacd0739de82c4d7cccba705d777"} Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.166222 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-cw6qt" event={"ID":"fc2b577b-2a9c-4651-95f2-ad815b073b61","Type":"ContainerStarted","Data":"d8fb835afc7538bbd3ba58671173aadcd9dd3daa9416005f35d22925b18560e9"} Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.170793 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xw7fm" event={"ID":"d39d05ac-72e8-4449-8b55-1b4126c64554","Type":"ContainerStarted","Data":"ec3ca880005d9ed7915e1c75229ced225a4c5267c894fd6d3a4ace9ee481e0e2"} Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.177061 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d72a4931-5c9f-4e46-a8b1-5f0f07b4c643","Type":"ContainerStarted","Data":"284985e42c38680d8dfd83e810dc25c79347f157dc527818af95370d62f1bba6"} Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.201162 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"f6256e41e920d222ccf54930d399b21cd032b6d7ace88624e5e3fa3510d642ea"} Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.211443 4835 scope.go:117] "RemoveContainer" containerID="0f6d996612c55cd95822a881c0a094bac2d7d280e2f7fdfbf163ace4f5e5fa01" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.228822 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.222872711 podStartE2EDuration="14.228806992s" podCreationTimestamp="2025-10-03 18:33:26 +0000 UTC" firstStartedPulling="2025-10-03 18:33:27.730156331 +0000 UTC m=+1149.446097193" lastFinishedPulling="2025-10-03 18:33:38.736090602 +0000 UTC m=+1160.452031474" observedRunningTime="2025-10-03 18:33:40.207209717 +0000 UTC m=+1161.923150589" watchObservedRunningTime="2025-10-03 18:33:40.228806992 +0000 UTC m=+1161.944747864" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.284272 4835 scope.go:117] "RemoveContainer" containerID="34da65470a92bff1873d6194522087c396885e0763a63c540d1699999c59c7d5" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.296832 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.317646 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.332253 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64dcfd48b6-tpcpd"] Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.340322 4835 scope.go:117] "RemoveContainer" containerID="f47b2382957676c586d20de83df106f0864dd9fcc944854a455b9bf25e1b8160" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.355229 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-64dcfd48b6-tpcpd"] Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.365007 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] 
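The "Observed pod startup duration" entry for openstack/openstackclient above reports two figures: podStartE2EDuration, roughly the wall-clock time from the pod's creation to when it was first observed running, and podStartSLOduration, which — as the monotonic m=+ offsets printed in the same entry suggest — is that end-to-end time minus the window spent pulling images between firstStartedPulling and lastFinishedPulling. A minimal Go sketch that re-derives the reported value purely from the numbers printed in that entry (an illustration of the arithmetic, not kubelet source code):

    // Re-derives podStartSLOduration for openstack/openstackclient from the
    // log entry above: SLO duration = E2E duration - image-pull window.
    // All constants are copied from that entry; nothing here is kubelet code.
    package main

    import "fmt"

    func main() {
        const (
            firstStartedPulling = 1149.446097193 // m=+1149.446097193 (monotonic seconds)
            lastFinishedPulling = 1160.452031474 // m=+1160.452031474
            e2eDuration         = 14.228806992   // podStartE2EDuration
            reportedSLO         = 3.222872711    // podStartSLOduration as logged
        )

        pullWindow := lastFinishedPulling - firstStartedPulling // time spent pulling images
        derivedSLO := e2eDuration - pullWindow

        fmt.Printf("image pull window:    %.9fs\n", pullWindow)
        fmt.Printf("derived SLO duration: %.9fs (reported: %.9fs)\n", derivedSLO, reportedSLO)
    }

Running this prints an image-pull window of about 11.005934281s and a derived SLO duration of 3.222872711s, matching the podStartSLOduration reported in the entry.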
Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.365089 4835 scope.go:117] "RemoveContainer" containerID="83e641e13eea890082718c9960c998038c38e8137ea51f5b598f75b184ca52c4" Oct 03 18:33:40 crc kubenswrapper[4835]: E1003 18:33:40.365481 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5d465a-f009-4cef-940e-3b2aaa64468b" containerName="horizon-log" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.365516 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5d465a-f009-4cef-940e-3b2aaa64468b" containerName="horizon-log" Oct 03 18:33:40 crc kubenswrapper[4835]: E1003 18:33:40.365544 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7762654-21e9-4999-b701-7498146153b2" containerName="ceilometer-central-agent" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.365551 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7762654-21e9-4999-b701-7498146153b2" containerName="ceilometer-central-agent" Oct 03 18:33:40 crc kubenswrapper[4835]: E1003 18:33:40.365563 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7762654-21e9-4999-b701-7498146153b2" containerName="proxy-httpd" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.365571 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7762654-21e9-4999-b701-7498146153b2" containerName="proxy-httpd" Oct 03 18:33:40 crc kubenswrapper[4835]: E1003 18:33:40.365595 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7762654-21e9-4999-b701-7498146153b2" containerName="sg-core" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.365602 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7762654-21e9-4999-b701-7498146153b2" containerName="sg-core" Oct 03 18:33:40 crc kubenswrapper[4835]: E1003 18:33:40.365616 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5d465a-f009-4cef-940e-3b2aaa64468b" containerName="horizon" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.365623 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5d465a-f009-4cef-940e-3b2aaa64468b" containerName="horizon" Oct 03 18:33:40 crc kubenswrapper[4835]: E1003 18:33:40.365644 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7762654-21e9-4999-b701-7498146153b2" containerName="ceilometer-notification-agent" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.365651 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7762654-21e9-4999-b701-7498146153b2" containerName="ceilometer-notification-agent" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.365867 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7762654-21e9-4999-b701-7498146153b2" containerName="ceilometer-central-agent" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.365888 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="de5d465a-f009-4cef-940e-3b2aaa64468b" containerName="horizon-log" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.365897 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7762654-21e9-4999-b701-7498146153b2" containerName="sg-core" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.365917 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="de5d465a-f009-4cef-940e-3b2aaa64468b" containerName="horizon" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.365928 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7762654-21e9-4999-b701-7498146153b2" 
containerName="proxy-httpd" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.365941 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7762654-21e9-4999-b701-7498146153b2" containerName="ceilometer-notification-agent" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.368708 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.370954 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.371486 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.374366 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.527663 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef0348-5bc7-4502-852d-20ad6663b871-log-httpd\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.527718 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.527791 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-config-data\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.527850 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef0348-5bc7-4502-852d-20ad6663b871-run-httpd\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.527909 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.527977 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-scripts\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.528011 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgmfb\" (UniqueName: \"kubernetes.io/projected/4bef0348-5bc7-4502-852d-20ad6663b871-kube-api-access-zgmfb\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 
18:33:40.629583 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.629664 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-scripts\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.629692 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgmfb\" (UniqueName: \"kubernetes.io/projected/4bef0348-5bc7-4502-852d-20ad6663b871-kube-api-access-zgmfb\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.629743 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef0348-5bc7-4502-852d-20ad6663b871-log-httpd\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.629760 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.629803 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-config-data\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.629838 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef0348-5bc7-4502-852d-20ad6663b871-run-httpd\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.630286 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef0348-5bc7-4502-852d-20ad6663b871-run-httpd\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.630495 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef0348-5bc7-4502-852d-20ad6663b871-log-httpd\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.635888 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.636165 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-scripts\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.638829 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-config-data\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.653902 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.654558 4835 scope.go:117] "RemoveContainer" containerID="c98cca13a795e8c49418d2713ce1f52d7c5add2a8a45469c7d0d5ca2a4207bec" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.658138 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgmfb\" (UniqueName: \"kubernetes.io/projected/4bef0348-5bc7-4502-852d-20ad6663b871-kube-api-access-zgmfb\") pod \"ceilometer-0\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.742480 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.914893 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7762654-21e9-4999-b701-7498146153b2" path="/var/lib/kubelet/pods/c7762654-21e9-4999-b701-7498146153b2/volumes" Oct 03 18:33:40 crc kubenswrapper[4835]: I1003 18:33:40.915693 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de5d465a-f009-4cef-940e-3b2aaa64468b" path="/var/lib/kubelet/pods/de5d465a-f009-4cef-940e-3b2aaa64468b/volumes" Oct 03 18:33:41 crc kubenswrapper[4835]: I1003 18:33:41.210536 4835 generic.go:334] "Generic (PLEG): container finished" podID="d39d05ac-72e8-4449-8b55-1b4126c64554" containerID="4e511619ab70773b888b62bab0bc38f6388687b9ed0b22698077a37c6edcb38f" exitCode=0 Oct 03 18:33:41 crc kubenswrapper[4835]: I1003 18:33:41.210572 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xw7fm" event={"ID":"d39d05ac-72e8-4449-8b55-1b4126c64554","Type":"ContainerDied","Data":"4e511619ab70773b888b62bab0bc38f6388687b9ed0b22698077a37c6edcb38f"} Oct 03 18:33:41 crc kubenswrapper[4835]: I1003 18:33:41.213630 4835 generic.go:334] "Generic (PLEG): container finished" podID="31632d24-3d3f-438c-a447-1a38f58ac87b" containerID="dc000fb4deab8ffabde1216d155dc0aea2824f8bf6e830491125bb95ba339aa4" exitCode=0 Oct 03 18:33:41 crc kubenswrapper[4835]: I1003 18:33:41.213690 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k2fcn" event={"ID":"31632d24-3d3f-438c-a447-1a38f58ac87b","Type":"ContainerDied","Data":"dc000fb4deab8ffabde1216d155dc0aea2824f8bf6e830491125bb95ba339aa4"} Oct 03 18:33:41 crc kubenswrapper[4835]: I1003 18:33:41.217729 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75bfc94c9f-srgmb" 
event={"ID":"f75702cb-25b4-45f5-a26f-0867f10dc525","Type":"ContainerStarted","Data":"3cd2073851e48bcc521b4f30aa70b09a2f7296312f9df48b1136d2555900aa6a"} Oct 03 18:33:41 crc kubenswrapper[4835]: I1003 18:33:41.217768 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75bfc94c9f-srgmb" event={"ID":"f75702cb-25b4-45f5-a26f-0867f10dc525","Type":"ContainerStarted","Data":"ca1ee15eb7c171eae7d7f7de4bc08c4177ede302ecc74cf8ae2aea418d7cfe38"} Oct 03 18:33:41 crc kubenswrapper[4835]: I1003 18:33:41.218898 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:41 crc kubenswrapper[4835]: I1003 18:33:41.218956 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:41 crc kubenswrapper[4835]: I1003 18:33:41.220543 4835 generic.go:334] "Generic (PLEG): container finished" podID="fc2b577b-2a9c-4651-95f2-ad815b073b61" containerID="71c8fdd79207970fbc9654e70c23a172a47f0fdab08777e00c655f5282cbb845" exitCode=0 Oct 03 18:33:41 crc kubenswrapper[4835]: I1003 18:33:41.221150 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-cw6qt" event={"ID":"fc2b577b-2a9c-4651-95f2-ad815b073b61","Type":"ContainerDied","Data":"71c8fdd79207970fbc9654e70c23a172a47f0fdab08777e00c655f5282cbb845"} Oct 03 18:33:41 crc kubenswrapper[4835]: I1003 18:33:41.268164 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:41 crc kubenswrapper[4835]: W1003 18:33:41.270910 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bef0348_5bc7_4502_852d_20ad6663b871.slice/crio-90ec8ed2e3e95220ad8ed45afd6f50c6bf16b2e03f5e1a1e848e943c9baacb0a WatchSource:0}: Error finding container 90ec8ed2e3e95220ad8ed45afd6f50c6bf16b2e03f5e1a1e848e943c9baacb0a: Status 404 returned error can't find the container with id 90ec8ed2e3e95220ad8ed45afd6f50c6bf16b2e03f5e1a1e848e943c9baacb0a Oct 03 18:33:41 crc kubenswrapper[4835]: I1003 18:33:41.300688 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-75bfc94c9f-srgmb" podStartSLOduration=9.300670055 podStartE2EDuration="9.300670055s" podCreationTimestamp="2025-10-03 18:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:33:41.281358036 +0000 UTC m=+1162.997298908" watchObservedRunningTime="2025-10-03 18:33:41.300670055 +0000 UTC m=+1163.016610927" Oct 03 18:33:42 crc kubenswrapper[4835]: I1003 18:33:42.232788 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef0348-5bc7-4502-852d-20ad6663b871","Type":"ContainerStarted","Data":"2c3b0be003cc2a1b33d30879456544195864250b81be6d690b4b419f2a146cf1"} Oct 03 18:33:42 crc kubenswrapper[4835]: I1003 18:33:42.235353 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef0348-5bc7-4502-852d-20ad6663b871","Type":"ContainerStarted","Data":"f6bcffd6a8d3a25ec1e2fe103a4b6fa1764d15b46f884f1814a4ba07682118cc"} Oct 03 18:33:42 crc kubenswrapper[4835]: I1003 18:33:42.235432 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef0348-5bc7-4502-852d-20ad6663b871","Type":"ContainerStarted","Data":"90ec8ed2e3e95220ad8ed45afd6f50c6bf16b2e03f5e1a1e848e943c9baacb0a"} Oct 03 
18:33:42 crc kubenswrapper[4835]: I1003 18:33:42.931338 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k2fcn" Oct 03 18:33:42 crc kubenswrapper[4835]: I1003 18:33:42.940865 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xw7fm" Oct 03 18:33:42 crc kubenswrapper[4835]: I1003 18:33:42.948661 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-cw6qt" Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.081117 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkfd4\" (UniqueName: \"kubernetes.io/projected/d39d05ac-72e8-4449-8b55-1b4126c64554-kube-api-access-xkfd4\") pod \"d39d05ac-72e8-4449-8b55-1b4126c64554\" (UID: \"d39d05ac-72e8-4449-8b55-1b4126c64554\") " Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.081162 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdmg4\" (UniqueName: \"kubernetes.io/projected/31632d24-3d3f-438c-a447-1a38f58ac87b-kube-api-access-wdmg4\") pod \"31632d24-3d3f-438c-a447-1a38f58ac87b\" (UID: \"31632d24-3d3f-438c-a447-1a38f58ac87b\") " Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.081374 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clwhs\" (UniqueName: \"kubernetes.io/projected/fc2b577b-2a9c-4651-95f2-ad815b073b61-kube-api-access-clwhs\") pod \"fc2b577b-2a9c-4651-95f2-ad815b073b61\" (UID: \"fc2b577b-2a9c-4651-95f2-ad815b073b61\") " Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.087298 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc2b577b-2a9c-4651-95f2-ad815b073b61-kube-api-access-clwhs" (OuterVolumeSpecName: "kube-api-access-clwhs") pod "fc2b577b-2a9c-4651-95f2-ad815b073b61" (UID: "fc2b577b-2a9c-4651-95f2-ad815b073b61"). InnerVolumeSpecName "kube-api-access-clwhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.100251 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d39d05ac-72e8-4449-8b55-1b4126c64554-kube-api-access-xkfd4" (OuterVolumeSpecName: "kube-api-access-xkfd4") pod "d39d05ac-72e8-4449-8b55-1b4126c64554" (UID: "d39d05ac-72e8-4449-8b55-1b4126c64554"). InnerVolumeSpecName "kube-api-access-xkfd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.108276 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31632d24-3d3f-438c-a447-1a38f58ac87b-kube-api-access-wdmg4" (OuterVolumeSpecName: "kube-api-access-wdmg4") pod "31632d24-3d3f-438c-a447-1a38f58ac87b" (UID: "31632d24-3d3f-438c-a447-1a38f58ac87b"). InnerVolumeSpecName "kube-api-access-wdmg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.183575 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clwhs\" (UniqueName: \"kubernetes.io/projected/fc2b577b-2a9c-4651-95f2-ad815b073b61-kube-api-access-clwhs\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.183609 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkfd4\" (UniqueName: \"kubernetes.io/projected/d39d05ac-72e8-4449-8b55-1b4126c64554-kube-api-access-xkfd4\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.183619 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdmg4\" (UniqueName: \"kubernetes.io/projected/31632d24-3d3f-438c-a447-1a38f58ac87b-kube-api-access-wdmg4\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.242030 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k2fcn" Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.242023 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k2fcn" event={"ID":"31632d24-3d3f-438c-a447-1a38f58ac87b","Type":"ContainerDied","Data":"0e23fddb5e9ffc4818477e7620aeb18eed64dacd0739de82c4d7cccba705d777"} Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.242190 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e23fddb5e9ffc4818477e7620aeb18eed64dacd0739de82c4d7cccba705d777" Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.243455 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-cw6qt" Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.243450 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-cw6qt" event={"ID":"fc2b577b-2a9c-4651-95f2-ad815b073b61","Type":"ContainerDied","Data":"d8fb835afc7538bbd3ba58671173aadcd9dd3daa9416005f35d22925b18560e9"} Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.243578 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8fb835afc7538bbd3ba58671173aadcd9dd3daa9416005f35d22925b18560e9" Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.244857 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xw7fm" event={"ID":"d39d05ac-72e8-4449-8b55-1b4126c64554","Type":"ContainerDied","Data":"ec3ca880005d9ed7915e1c75229ced225a4c5267c894fd6d3a4ace9ee481e0e2"} Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.244901 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xw7fm" Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.244907 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec3ca880005d9ed7915e1c75229ced225a4c5267c894fd6d3a4ace9ee481e0e2" Oct 03 18:33:43 crc kubenswrapper[4835]: I1003 18:33:43.803930 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:44 crc kubenswrapper[4835]: I1003 18:33:44.255090 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef0348-5bc7-4502-852d-20ad6663b871","Type":"ContainerStarted","Data":"194c141bfe3b20470ac29c5a99f7387bd4d31054ae4e823ff0e0c67bf5a1c64a"} Oct 03 18:33:45 crc kubenswrapper[4835]: I1003 18:33:45.265105 4835 generic.go:334] "Generic (PLEG): container finished" podID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerID="97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c" exitCode=1 Oct 03 18:33:45 crc kubenswrapper[4835]: I1003 18:33:45.265460 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8110d0e5-9e19-4306-b8aa-babe937e8d2a","Type":"ContainerDied","Data":"97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c"} Oct 03 18:33:45 crc kubenswrapper[4835]: I1003 18:33:45.265507 4835 scope.go:117] "RemoveContainer" containerID="93a43d0e5cc9cf9fc0d017857367c8f6de72ed7cb5c9b28e99b416df23f0b755" Oct 03 18:33:45 crc kubenswrapper[4835]: I1003 18:33:45.266431 4835 scope.go:117] "RemoveContainer" containerID="97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c" Oct 03 18:33:45 crc kubenswrapper[4835]: E1003 18:33:45.266871 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(8110d0e5-9e19-4306-b8aa-babe937e8d2a)\"" pod="openstack/watcher-decision-engine-0" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" Oct 03 18:33:46 crc kubenswrapper[4835]: I1003 18:33:46.276508 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef0348-5bc7-4502-852d-20ad6663b871","Type":"ContainerStarted","Data":"45257fb09a143a899bdbc2ebb874a9f1e5cd6bf8a23b262b6982a5e661c2dfdd"} Oct 03 18:33:46 crc kubenswrapper[4835]: I1003 18:33:46.277314 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" containerName="ceilometer-central-agent" containerID="cri-o://f6bcffd6a8d3a25ec1e2fe103a4b6fa1764d15b46f884f1814a4ba07682118cc" gracePeriod=30 Oct 03 18:33:46 crc kubenswrapper[4835]: I1003 18:33:46.277855 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 18:33:46 crc kubenswrapper[4835]: I1003 18:33:46.278233 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" containerName="sg-core" containerID="cri-o://194c141bfe3b20470ac29c5a99f7387bd4d31054ae4e823ff0e0c67bf5a1c64a" gracePeriod=30 Oct 03 18:33:46 crc kubenswrapper[4835]: I1003 18:33:46.278234 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" containerName="proxy-httpd" 
containerID="cri-o://45257fb09a143a899bdbc2ebb874a9f1e5cd6bf8a23b262b6982a5e661c2dfdd" gracePeriod=30 Oct 03 18:33:46 crc kubenswrapper[4835]: I1003 18:33:46.278321 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" containerName="ceilometer-notification-agent" containerID="cri-o://2c3b0be003cc2a1b33d30879456544195864250b81be6d690b4b419f2a146cf1" gracePeriod=30 Oct 03 18:33:46 crc kubenswrapper[4835]: I1003 18:33:46.311405 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.143679912 podStartE2EDuration="6.311381902s" podCreationTimestamp="2025-10-03 18:33:40 +0000 UTC" firstStartedPulling="2025-10-03 18:33:41.281633963 +0000 UTC m=+1162.997574835" lastFinishedPulling="2025-10-03 18:33:45.449335953 +0000 UTC m=+1167.165276825" observedRunningTime="2025-10-03 18:33:46.298796285 +0000 UTC m=+1168.014737157" watchObservedRunningTime="2025-10-03 18:33:46.311381902 +0000 UTC m=+1168.027322774" Oct 03 18:33:47 crc kubenswrapper[4835]: I1003 18:33:47.298852 4835 generic.go:334] "Generic (PLEG): container finished" podID="4bef0348-5bc7-4502-852d-20ad6663b871" containerID="45257fb09a143a899bdbc2ebb874a9f1e5cd6bf8a23b262b6982a5e661c2dfdd" exitCode=0 Oct 03 18:33:47 crc kubenswrapper[4835]: I1003 18:33:47.299184 4835 generic.go:334] "Generic (PLEG): container finished" podID="4bef0348-5bc7-4502-852d-20ad6663b871" containerID="194c141bfe3b20470ac29c5a99f7387bd4d31054ae4e823ff0e0c67bf5a1c64a" exitCode=2 Oct 03 18:33:47 crc kubenswrapper[4835]: I1003 18:33:47.299196 4835 generic.go:334] "Generic (PLEG): container finished" podID="4bef0348-5bc7-4502-852d-20ad6663b871" containerID="2c3b0be003cc2a1b33d30879456544195864250b81be6d690b4b419f2a146cf1" exitCode=0 Oct 03 18:33:47 crc kubenswrapper[4835]: I1003 18:33:47.298922 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef0348-5bc7-4502-852d-20ad6663b871","Type":"ContainerDied","Data":"45257fb09a143a899bdbc2ebb874a9f1e5cd6bf8a23b262b6982a5e661c2dfdd"} Oct 03 18:33:47 crc kubenswrapper[4835]: I1003 18:33:47.299232 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef0348-5bc7-4502-852d-20ad6663b871","Type":"ContainerDied","Data":"194c141bfe3b20470ac29c5a99f7387bd4d31054ae4e823ff0e0c67bf5a1c64a"} Oct 03 18:33:47 crc kubenswrapper[4835]: I1003 18:33:47.299246 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef0348-5bc7-4502-852d-20ad6663b871","Type":"ContainerDied","Data":"2c3b0be003cc2a1b33d30879456544195864250b81be6d690b4b419f2a146cf1"} Oct 03 18:33:47 crc kubenswrapper[4835]: I1003 18:33:47.419003 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:47 crc kubenswrapper[4835]: I1003 18:33:47.631399 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:47 crc kubenswrapper[4835]: I1003 18:33:47.637150 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-75bfc94c9f-srgmb" Oct 03 18:33:48 crc kubenswrapper[4835]: I1003 18:33:48.166040 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 18:33:48 crc kubenswrapper[4835]: I1003 18:33:48.166102 4835 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 18:33:48 crc kubenswrapper[4835]: I1003 18:33:48.166734 4835 scope.go:117] "RemoveContainer" containerID="97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c" Oct 03 18:33:48 crc kubenswrapper[4835]: E1003 18:33:48.166963 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(8110d0e5-9e19-4306-b8aa-babe937e8d2a)\"" pod="openstack/watcher-decision-engine-0" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.193998 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.320498 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-sg-core-conf-yaml\") pod \"4bef0348-5bc7-4502-852d-20ad6663b871\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.320549 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-combined-ca-bundle\") pod \"4bef0348-5bc7-4502-852d-20ad6663b871\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.320621 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-config-data\") pod \"4bef0348-5bc7-4502-852d-20ad6663b871\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.320668 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef0348-5bc7-4502-852d-20ad6663b871-log-httpd\") pod \"4bef0348-5bc7-4502-852d-20ad6663b871\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.320738 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef0348-5bc7-4502-852d-20ad6663b871-run-httpd\") pod \"4bef0348-5bc7-4502-852d-20ad6663b871\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.320799 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgmfb\" (UniqueName: \"kubernetes.io/projected/4bef0348-5bc7-4502-852d-20ad6663b871-kube-api-access-zgmfb\") pod \"4bef0348-5bc7-4502-852d-20ad6663b871\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.320907 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-scripts\") pod \"4bef0348-5bc7-4502-852d-20ad6663b871\" (UID: \"4bef0348-5bc7-4502-852d-20ad6663b871\") " Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.321336 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bef0348-5bc7-4502-852d-20ad6663b871-log-httpd" 
(OuterVolumeSpecName: "log-httpd") pod "4bef0348-5bc7-4502-852d-20ad6663b871" (UID: "4bef0348-5bc7-4502-852d-20ad6663b871"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.321558 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bef0348-5bc7-4502-852d-20ad6663b871-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4bef0348-5bc7-4502-852d-20ad6663b871" (UID: "4bef0348-5bc7-4502-852d-20ad6663b871"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.332297 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bef0348-5bc7-4502-852d-20ad6663b871-kube-api-access-zgmfb" (OuterVolumeSpecName: "kube-api-access-zgmfb") pod "4bef0348-5bc7-4502-852d-20ad6663b871" (UID: "4bef0348-5bc7-4502-852d-20ad6663b871"). InnerVolumeSpecName "kube-api-access-zgmfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.340896 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-scripts" (OuterVolumeSpecName: "scripts") pod "4bef0348-5bc7-4502-852d-20ad6663b871" (UID: "4bef0348-5bc7-4502-852d-20ad6663b871"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.344725 4835 generic.go:334] "Generic (PLEG): container finished" podID="4bef0348-5bc7-4502-852d-20ad6663b871" containerID="f6bcffd6a8d3a25ec1e2fe103a4b6fa1764d15b46f884f1814a4ba07682118cc" exitCode=0 Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.344771 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef0348-5bc7-4502-852d-20ad6663b871","Type":"ContainerDied","Data":"f6bcffd6a8d3a25ec1e2fe103a4b6fa1764d15b46f884f1814a4ba07682118cc"} Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.344798 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4bef0348-5bc7-4502-852d-20ad6663b871","Type":"ContainerDied","Data":"90ec8ed2e3e95220ad8ed45afd6f50c6bf16b2e03f5e1a1e848e943c9baacb0a"} Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.344796 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.344823 4835 scope.go:117] "RemoveContainer" containerID="45257fb09a143a899bdbc2ebb874a9f1e5cd6bf8a23b262b6982a5e661c2dfdd" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.372816 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4bef0348-5bc7-4502-852d-20ad6663b871" (UID: "4bef0348-5bc7-4502-852d-20ad6663b871"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.424039 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgmfb\" (UniqueName: \"kubernetes.io/projected/4bef0348-5bc7-4502-852d-20ad6663b871-kube-api-access-zgmfb\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.424444 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.424457 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.424493 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef0348-5bc7-4502-852d-20ad6663b871-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.424505 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4bef0348-5bc7-4502-852d-20ad6663b871-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.428490 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bef0348-5bc7-4502-852d-20ad6663b871" (UID: "4bef0348-5bc7-4502-852d-20ad6663b871"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.446460 4835 scope.go:117] "RemoveContainer" containerID="194c141bfe3b20470ac29c5a99f7387bd4d31054ae4e823ff0e0c67bf5a1c64a" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.468279 4835 scope.go:117] "RemoveContainer" containerID="2c3b0be003cc2a1b33d30879456544195864250b81be6d690b4b419f2a146cf1" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.473919 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-84d4b96669-666zm" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.480285 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-config-data" (OuterVolumeSpecName: "config-data") pod "4bef0348-5bc7-4502-852d-20ad6663b871" (UID: "4bef0348-5bc7-4502-852d-20ad6663b871"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.490559 4835 scope.go:117] "RemoveContainer" containerID="f6bcffd6a8d3a25ec1e2fe103a4b6fa1764d15b46f884f1814a4ba07682118cc" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.526198 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.526244 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bef0348-5bc7-4502-852d-20ad6663b871-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.533264 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bc8b99c48-vnw8l"] Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.533510 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bc8b99c48-vnw8l" podUID="d775c7bb-fbc5-42d0-9f09-9edfac2d88e3" containerName="neutron-api" containerID="cri-o://e1e711ba78a8bc75bfb1e51e87b4144137d3d0a868e3ef228156096a8eb936b8" gracePeriod=30 Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.533933 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bc8b99c48-vnw8l" podUID="d775c7bb-fbc5-42d0-9f09-9edfac2d88e3" containerName="neutron-httpd" containerID="cri-o://420f7d3d203ffe2fdbff1a6c97f48d176d2b897cfde5fc2523caa34a448cf47a" gracePeriod=30 Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.534118 4835 scope.go:117] "RemoveContainer" containerID="45257fb09a143a899bdbc2ebb874a9f1e5cd6bf8a23b262b6982a5e661c2dfdd" Oct 03 18:33:50 crc kubenswrapper[4835]: E1003 18:33:50.537385 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45257fb09a143a899bdbc2ebb874a9f1e5cd6bf8a23b262b6982a5e661c2dfdd\": container with ID starting with 45257fb09a143a899bdbc2ebb874a9f1e5cd6bf8a23b262b6982a5e661c2dfdd not found: ID does not exist" containerID="45257fb09a143a899bdbc2ebb874a9f1e5cd6bf8a23b262b6982a5e661c2dfdd" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.537423 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45257fb09a143a899bdbc2ebb874a9f1e5cd6bf8a23b262b6982a5e661c2dfdd"} err="failed to get container status \"45257fb09a143a899bdbc2ebb874a9f1e5cd6bf8a23b262b6982a5e661c2dfdd\": rpc error: code = NotFound desc = could not find container \"45257fb09a143a899bdbc2ebb874a9f1e5cd6bf8a23b262b6982a5e661c2dfdd\": container with ID starting with 45257fb09a143a899bdbc2ebb874a9f1e5cd6bf8a23b262b6982a5e661c2dfdd not found: ID does not exist" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.537445 4835 scope.go:117] "RemoveContainer" containerID="194c141bfe3b20470ac29c5a99f7387bd4d31054ae4e823ff0e0c67bf5a1c64a" Oct 03 18:33:50 crc kubenswrapper[4835]: E1003 18:33:50.542158 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"194c141bfe3b20470ac29c5a99f7387bd4d31054ae4e823ff0e0c67bf5a1c64a\": container with ID starting with 194c141bfe3b20470ac29c5a99f7387bd4d31054ae4e823ff0e0c67bf5a1c64a not found: ID does not exist" containerID="194c141bfe3b20470ac29c5a99f7387bd4d31054ae4e823ff0e0c67bf5a1c64a" Oct 03 18:33:50 crc kubenswrapper[4835]: 
I1003 18:33:50.542187 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"194c141bfe3b20470ac29c5a99f7387bd4d31054ae4e823ff0e0c67bf5a1c64a"} err="failed to get container status \"194c141bfe3b20470ac29c5a99f7387bd4d31054ae4e823ff0e0c67bf5a1c64a\": rpc error: code = NotFound desc = could not find container \"194c141bfe3b20470ac29c5a99f7387bd4d31054ae4e823ff0e0c67bf5a1c64a\": container with ID starting with 194c141bfe3b20470ac29c5a99f7387bd4d31054ae4e823ff0e0c67bf5a1c64a not found: ID does not exist" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.542206 4835 scope.go:117] "RemoveContainer" containerID="2c3b0be003cc2a1b33d30879456544195864250b81be6d690b4b419f2a146cf1" Oct 03 18:33:50 crc kubenswrapper[4835]: E1003 18:33:50.542830 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c3b0be003cc2a1b33d30879456544195864250b81be6d690b4b419f2a146cf1\": container with ID starting with 2c3b0be003cc2a1b33d30879456544195864250b81be6d690b4b419f2a146cf1 not found: ID does not exist" containerID="2c3b0be003cc2a1b33d30879456544195864250b81be6d690b4b419f2a146cf1" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.542852 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c3b0be003cc2a1b33d30879456544195864250b81be6d690b4b419f2a146cf1"} err="failed to get container status \"2c3b0be003cc2a1b33d30879456544195864250b81be6d690b4b419f2a146cf1\": rpc error: code = NotFound desc = could not find container \"2c3b0be003cc2a1b33d30879456544195864250b81be6d690b4b419f2a146cf1\": container with ID starting with 2c3b0be003cc2a1b33d30879456544195864250b81be6d690b4b419f2a146cf1 not found: ID does not exist" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.542885 4835 scope.go:117] "RemoveContainer" containerID="f6bcffd6a8d3a25ec1e2fe103a4b6fa1764d15b46f884f1814a4ba07682118cc" Oct 03 18:33:50 crc kubenswrapper[4835]: E1003 18:33:50.543493 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6bcffd6a8d3a25ec1e2fe103a4b6fa1764d15b46f884f1814a4ba07682118cc\": container with ID starting with f6bcffd6a8d3a25ec1e2fe103a4b6fa1764d15b46f884f1814a4ba07682118cc not found: ID does not exist" containerID="f6bcffd6a8d3a25ec1e2fe103a4b6fa1764d15b46f884f1814a4ba07682118cc" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.543518 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6bcffd6a8d3a25ec1e2fe103a4b6fa1764d15b46f884f1814a4ba07682118cc"} err="failed to get container status \"f6bcffd6a8d3a25ec1e2fe103a4b6fa1764d15b46f884f1814a4ba07682118cc\": rpc error: code = NotFound desc = could not find container \"f6bcffd6a8d3a25ec1e2fe103a4b6fa1764d15b46f884f1814a4ba07682118cc\": container with ID starting with f6bcffd6a8d3a25ec1e2fe103a4b6fa1764d15b46f884f1814a4ba07682118cc not found: ID does not exist" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.690127 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.709190 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.719272 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:50 crc kubenswrapper[4835]: E1003 18:33:50.719919 4835 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fc2b577b-2a9c-4651-95f2-ad815b073b61" containerName="mariadb-database-create" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.720007 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2b577b-2a9c-4651-95f2-ad815b073b61" containerName="mariadb-database-create" Oct 03 18:33:50 crc kubenswrapper[4835]: E1003 18:33:50.720093 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" containerName="sg-core" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.720164 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" containerName="sg-core" Oct 03 18:33:50 crc kubenswrapper[4835]: E1003 18:33:50.720238 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" containerName="proxy-httpd" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.720294 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" containerName="proxy-httpd" Oct 03 18:33:50 crc kubenswrapper[4835]: E1003 18:33:50.720361 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" containerName="ceilometer-central-agent" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.720410 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" containerName="ceilometer-central-agent" Oct 03 18:33:50 crc kubenswrapper[4835]: E1003 18:33:50.720469 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31632d24-3d3f-438c-a447-1a38f58ac87b" containerName="mariadb-database-create" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.720522 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="31632d24-3d3f-438c-a447-1a38f58ac87b" containerName="mariadb-database-create" Oct 03 18:33:50 crc kubenswrapper[4835]: E1003 18:33:50.720582 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" containerName="ceilometer-notification-agent" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.720640 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" containerName="ceilometer-notification-agent" Oct 03 18:33:50 crc kubenswrapper[4835]: E1003 18:33:50.720695 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d39d05ac-72e8-4449-8b55-1b4126c64554" containerName="mariadb-database-create" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.720755 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39d05ac-72e8-4449-8b55-1b4126c64554" containerName="mariadb-database-create" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.721033 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="31632d24-3d3f-438c-a447-1a38f58ac87b" containerName="mariadb-database-create" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.721121 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" containerName="proxy-httpd" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.721192 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc2b577b-2a9c-4651-95f2-ad815b073b61" containerName="mariadb-database-create" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.721255 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" 
containerName="ceilometer-notification-agent" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.721315 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" containerName="sg-core" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.721374 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d39d05ac-72e8-4449-8b55-1b4126c64554" containerName="mariadb-database-create" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.721428 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" containerName="ceilometer-central-agent" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.723278 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.729780 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.730505 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.731134 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.833797 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x9jn\" (UniqueName: \"kubernetes.io/projected/095f7b98-b6bf-424a-a33c-d8e601b65aa0-kube-api-access-7x9jn\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.833865 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095f7b98-b6bf-424a-a33c-d8e601b65aa0-log-httpd\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.833995 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095f7b98-b6bf-424a-a33c-d8e601b65aa0-run-httpd\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.834130 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-scripts\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.834221 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.834363 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 
03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.834404 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-config-data\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.890795 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bef0348-5bc7-4502-852d-20ad6663b871" path="/var/lib/kubelet/pods/4bef0348-5bc7-4502-852d-20ad6663b871/volumes" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.935968 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095f7b98-b6bf-424a-a33c-d8e601b65aa0-run-httpd\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.936024 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-scripts\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.936079 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.936132 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.936150 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-config-data\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.936215 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x9jn\" (UniqueName: \"kubernetes.io/projected/095f7b98-b6bf-424a-a33c-d8e601b65aa0-kube-api-access-7x9jn\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.936237 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095f7b98-b6bf-424a-a33c-d8e601b65aa0-log-httpd\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.936598 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095f7b98-b6bf-424a-a33c-d8e601b65aa0-log-httpd\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.936778 4835 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095f7b98-b6bf-424a-a33c-d8e601b65aa0-run-httpd\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.940062 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.940238 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-scripts\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.940956 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-config-data\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.949688 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:50 crc kubenswrapper[4835]: I1003 18:33:50.952032 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x9jn\" (UniqueName: \"kubernetes.io/projected/095f7b98-b6bf-424a-a33c-d8e601b65aa0-kube-api-access-7x9jn\") pod \"ceilometer-0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " pod="openstack/ceilometer-0" Oct 03 18:33:51 crc kubenswrapper[4835]: I1003 18:33:51.061639 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:33:51 crc kubenswrapper[4835]: I1003 18:33:51.359709 4835 generic.go:334] "Generic (PLEG): container finished" podID="d775c7bb-fbc5-42d0-9f09-9edfac2d88e3" containerID="420f7d3d203ffe2fdbff1a6c97f48d176d2b897cfde5fc2523caa34a448cf47a" exitCode=0 Oct 03 18:33:51 crc kubenswrapper[4835]: I1003 18:33:51.359782 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bc8b99c48-vnw8l" event={"ID":"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3","Type":"ContainerDied","Data":"420f7d3d203ffe2fdbff1a6c97f48d176d2b897cfde5fc2523caa34a448cf47a"} Oct 03 18:33:51 crc kubenswrapper[4835]: I1003 18:33:51.501489 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:52 crc kubenswrapper[4835]: I1003 18:33:52.371928 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095f7b98-b6bf-424a-a33c-d8e601b65aa0","Type":"ContainerStarted","Data":"3d4d7e7e929fac6493cadd3d660ec98b3e17330084fa60291fca2c434bd9fdfc"} Oct 03 18:33:52 crc kubenswrapper[4835]: I1003 18:33:52.372502 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095f7b98-b6bf-424a-a33c-d8e601b65aa0","Type":"ContainerStarted","Data":"4c938b9751aef4a88f1be6023224d89631d84384843154a46986b2caadbfdfb2"} Oct 03 18:33:52 crc kubenswrapper[4835]: I1003 18:33:52.372513 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095f7b98-b6bf-424a-a33c-d8e601b65aa0","Type":"ContainerStarted","Data":"88942db75e9c474fea39d6fb9d1bf3391cc994cd1a0f9227d92f94fe0742d2f8"} Oct 03 18:33:53 crc kubenswrapper[4835]: I1003 18:33:53.383809 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095f7b98-b6bf-424a-a33c-d8e601b65aa0","Type":"ContainerStarted","Data":"9b651731199ee5a8694625c57a3db167b0a0bba1f1ae0e052a758c314e985ac3"} Oct 03 18:33:55 crc kubenswrapper[4835]: I1003 18:33:55.364525 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:33:55 crc kubenswrapper[4835]: I1003 18:33:55.404974 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095f7b98-b6bf-424a-a33c-d8e601b65aa0","Type":"ContainerStarted","Data":"d9a2ba0400c7d500bd74227cab1eeff30df2b0f0c231ef47800225e9121aafe2"} Oct 03 18:33:55 crc kubenswrapper[4835]: I1003 18:33:55.405204 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 18:33:55 crc kubenswrapper[4835]: I1003 18:33:55.431163 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.358589761 podStartE2EDuration="5.431146671s" podCreationTimestamp="2025-10-03 18:33:50 +0000 UTC" firstStartedPulling="2025-10-03 18:33:51.50964768 +0000 UTC m=+1173.225588552" lastFinishedPulling="2025-10-03 18:33:54.58220459 +0000 UTC m=+1176.298145462" observedRunningTime="2025-10-03 18:33:55.425480804 +0000 UTC m=+1177.141421676" watchObservedRunningTime="2025-10-03 18:33:55.431146671 +0000 UTC m=+1177.147087543" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.243570 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.341354 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-ovndb-tls-certs\") pod \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.341491 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-config\") pod \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.341568 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jp6x\" (UniqueName: \"kubernetes.io/projected/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-kube-api-access-2jp6x\") pod \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.341643 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-httpd-config\") pod \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.341690 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-combined-ca-bundle\") pod \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\" (UID: \"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3\") " Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.349315 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-kube-api-access-2jp6x" (OuterVolumeSpecName: "kube-api-access-2jp6x") pod "d775c7bb-fbc5-42d0-9f09-9edfac2d88e3" (UID: "d775c7bb-fbc5-42d0-9f09-9edfac2d88e3"). InnerVolumeSpecName "kube-api-access-2jp6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.349997 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d775c7bb-fbc5-42d0-9f09-9edfac2d88e3" (UID: "d775c7bb-fbc5-42d0-9f09-9edfac2d88e3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.408056 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d775c7bb-fbc5-42d0-9f09-9edfac2d88e3" (UID: "d775c7bb-fbc5-42d0-9f09-9edfac2d88e3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.420061 4835 generic.go:334] "Generic (PLEG): container finished" podID="d775c7bb-fbc5-42d0-9f09-9edfac2d88e3" containerID="e1e711ba78a8bc75bfb1e51e87b4144137d3d0a868e3ef228156096a8eb936b8" exitCode=0 Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.420408 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerName="ceilometer-central-agent" containerID="cri-o://4c938b9751aef4a88f1be6023224d89631d84384843154a46986b2caadbfdfb2" gracePeriod=30 Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.422064 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bc8b99c48-vnw8l" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.422524 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerName="proxy-httpd" containerID="cri-o://d9a2ba0400c7d500bd74227cab1eeff30df2b0f0c231ef47800225e9121aafe2" gracePeriod=30 Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.422636 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerName="sg-core" containerID="cri-o://9b651731199ee5a8694625c57a3db167b0a0bba1f1ae0e052a758c314e985ac3" gracePeriod=30 Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.422656 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bc8b99c48-vnw8l" event={"ID":"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3","Type":"ContainerDied","Data":"e1e711ba78a8bc75bfb1e51e87b4144137d3d0a868e3ef228156096a8eb936b8"} Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.422696 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerName="ceilometer-notification-agent" containerID="cri-o://3d4d7e7e929fac6493cadd3d660ec98b3e17330084fa60291fca2c434bd9fdfc" gracePeriod=30 Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.422712 4835 scope.go:117] "RemoveContainer" containerID="420f7d3d203ffe2fdbff1a6c97f48d176d2b897cfde5fc2523caa34a448cf47a" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.422700 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bc8b99c48-vnw8l" event={"ID":"d775c7bb-fbc5-42d0-9f09-9edfac2d88e3","Type":"ContainerDied","Data":"2fa3eb268d80fa91270fa7fc799e9cef57a9ad0e20620a4b1ee4b0ad66594dbe"} Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.451510 4835 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.454484 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.454620 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jp6x\" (UniqueName: \"kubernetes.io/projected/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-kube-api-access-2jp6x\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 
18:33:56.464418 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0521-account-create-bb2fh"] Oct 03 18:33:56 crc kubenswrapper[4835]: E1003 18:33:56.469255 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d775c7bb-fbc5-42d0-9f09-9edfac2d88e3" containerName="neutron-api" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.469278 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d775c7bb-fbc5-42d0-9f09-9edfac2d88e3" containerName="neutron-api" Oct 03 18:33:56 crc kubenswrapper[4835]: E1003 18:33:56.469296 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d775c7bb-fbc5-42d0-9f09-9edfac2d88e3" containerName="neutron-httpd" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.469303 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d775c7bb-fbc5-42d0-9f09-9edfac2d88e3" containerName="neutron-httpd" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.469926 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d775c7bb-fbc5-42d0-9f09-9edfac2d88e3" containerName="neutron-api" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.469937 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d775c7bb-fbc5-42d0-9f09-9edfac2d88e3" containerName="neutron-httpd" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.480033 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0521-account-create-bb2fh" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.484557 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.493351 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-config" (OuterVolumeSpecName: "config") pod "d775c7bb-fbc5-42d0-9f09-9edfac2d88e3" (UID: "d775c7bb-fbc5-42d0-9f09-9edfac2d88e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.494233 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0521-account-create-bb2fh"] Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.495614 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d775c7bb-fbc5-42d0-9f09-9edfac2d88e3" (UID: "d775c7bb-fbc5-42d0-9f09-9edfac2d88e3"). InnerVolumeSpecName "ovndb-tls-certs". 
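
Many entries in this journal are PLEG (Pod Lifecycle Event Generator) notifications of the form "SyncLoop (PLEG): event for pod" ... event={"ID":...,"Type":...,"Data":...}. A small Python sketch that pulls those events out of a saved copy of this journal, assuming one journal entry per line as journalctl emits them; the regex is fitted to the exact format shown in these lines and nothing more general:

    import re
    import sys

    # Field names ID/Type/Data and the surrounding text are copied from the
    # "SyncLoop (PLEG): event for pod" entries above.
    PLEG = re.compile(
        r'"SyncLoop \(PLEG\): event for pod" pod="(?P<pod>[^"]+)" '
        r'event=\{"ID":"(?P<uid>[^"]+)","Type":"(?P<type>[^"]+)","Data":"(?P<data>[^"]+)"\}'
    )

    def pleg_events(lines):
        """Yield (pod, pod_uid, event_type, container_or_sandbox_id) tuples."""
        for line in lines:
            m = PLEG.search(line)
            if m:
                yield m.group("pod", "uid", "type", "data")

    if __name__ == "__main__":
        for pod, uid, etype, data in pleg_events(sys.stdin):
            print(f"{pod}\t{etype}\t{data}")

Fed this journal, it would emit, for instance, the ContainerDied events for openstack/neutron-bc8b99c48-vnw8l and the ContainerStarted events for openstack/ceilometer-0 seen above.
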
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.498776 4835 scope.go:117] "RemoveContainer" containerID="e1e711ba78a8bc75bfb1e51e87b4144137d3d0a868e3ef228156096a8eb936b8" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.522879 4835 scope.go:117] "RemoveContainer" containerID="420f7d3d203ffe2fdbff1a6c97f48d176d2b897cfde5fc2523caa34a448cf47a" Oct 03 18:33:56 crc kubenswrapper[4835]: E1003 18:33:56.523400 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"420f7d3d203ffe2fdbff1a6c97f48d176d2b897cfde5fc2523caa34a448cf47a\": container with ID starting with 420f7d3d203ffe2fdbff1a6c97f48d176d2b897cfde5fc2523caa34a448cf47a not found: ID does not exist" containerID="420f7d3d203ffe2fdbff1a6c97f48d176d2b897cfde5fc2523caa34a448cf47a" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.523443 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"420f7d3d203ffe2fdbff1a6c97f48d176d2b897cfde5fc2523caa34a448cf47a"} err="failed to get container status \"420f7d3d203ffe2fdbff1a6c97f48d176d2b897cfde5fc2523caa34a448cf47a\": rpc error: code = NotFound desc = could not find container \"420f7d3d203ffe2fdbff1a6c97f48d176d2b897cfde5fc2523caa34a448cf47a\": container with ID starting with 420f7d3d203ffe2fdbff1a6c97f48d176d2b897cfde5fc2523caa34a448cf47a not found: ID does not exist" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.523471 4835 scope.go:117] "RemoveContainer" containerID="e1e711ba78a8bc75bfb1e51e87b4144137d3d0a868e3ef228156096a8eb936b8" Oct 03 18:33:56 crc kubenswrapper[4835]: E1003 18:33:56.524042 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1e711ba78a8bc75bfb1e51e87b4144137d3d0a868e3ef228156096a8eb936b8\": container with ID starting with e1e711ba78a8bc75bfb1e51e87b4144137d3d0a868e3ef228156096a8eb936b8 not found: ID does not exist" containerID="e1e711ba78a8bc75bfb1e51e87b4144137d3d0a868e3ef228156096a8eb936b8" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.524098 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e711ba78a8bc75bfb1e51e87b4144137d3d0a868e3ef228156096a8eb936b8"} err="failed to get container status \"e1e711ba78a8bc75bfb1e51e87b4144137d3d0a868e3ef228156096a8eb936b8\": rpc error: code = NotFound desc = could not find container \"e1e711ba78a8bc75bfb1e51e87b4144137d3d0a868e3ef228156096a8eb936b8\": container with ID starting with e1e711ba78a8bc75bfb1e51e87b4144137d3d0a868e3ef228156096a8eb936b8 not found: ID does not exist" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.556104 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zmxb\" (UniqueName: \"kubernetes.io/projected/b76843d0-52ea-4ef4-ad05-460c819fed88-kube-api-access-5zmxb\") pod \"nova-api-0521-account-create-bb2fh\" (UID: \"b76843d0-52ea-4ef4-ad05-460c819fed88\") " pod="openstack/nova-api-0521-account-create-bb2fh" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.556687 4835 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.556785 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.574884 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4642-account-create-9fv4f"] Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.577357 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4642-account-create-9fv4f" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.579642 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.581152 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4642-account-create-9fv4f"] Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.658662 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtsgg\" (UniqueName: \"kubernetes.io/projected/64736945-0cbb-43dc-9311-4c03123c878b-kube-api-access-vtsgg\") pod \"nova-cell0-4642-account-create-9fv4f\" (UID: \"64736945-0cbb-43dc-9311-4c03123c878b\") " pod="openstack/nova-cell0-4642-account-create-9fv4f" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.658725 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zmxb\" (UniqueName: \"kubernetes.io/projected/b76843d0-52ea-4ef4-ad05-460c819fed88-kube-api-access-5zmxb\") pod \"nova-api-0521-account-create-bb2fh\" (UID: \"b76843d0-52ea-4ef4-ad05-460c819fed88\") " pod="openstack/nova-api-0521-account-create-bb2fh" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.677948 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zmxb\" (UniqueName: \"kubernetes.io/projected/b76843d0-52ea-4ef4-ad05-460c819fed88-kube-api-access-5zmxb\") pod \"nova-api-0521-account-create-bb2fh\" (UID: \"b76843d0-52ea-4ef4-ad05-460c819fed88\") " pod="openstack/nova-api-0521-account-create-bb2fh" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.761752 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtsgg\" (UniqueName: \"kubernetes.io/projected/64736945-0cbb-43dc-9311-4c03123c878b-kube-api-access-vtsgg\") pod \"nova-cell0-4642-account-create-9fv4f\" (UID: \"64736945-0cbb-43dc-9311-4c03123c878b\") " pod="openstack/nova-cell0-4642-account-create-9fv4f" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.780383 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtsgg\" (UniqueName: \"kubernetes.io/projected/64736945-0cbb-43dc-9311-4c03123c878b-kube-api-access-vtsgg\") pod \"nova-cell0-4642-account-create-9fv4f\" (UID: \"64736945-0cbb-43dc-9311-4c03123c878b\") " pod="openstack/nova-cell0-4642-account-create-9fv4f" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.798119 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bc8b99c48-vnw8l"] Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.806305 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-bc8b99c48-vnw8l"] Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.812517 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0521-account-create-bb2fh" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.831676 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3b08-account-create-ddf9g"] Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.832957 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3b08-account-create-ddf9g" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.838377 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.859711 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3b08-account-create-ddf9g"] Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.907653 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4642-account-create-9fv4f" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.912851 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d775c7bb-fbc5-42d0-9f09-9edfac2d88e3" path="/var/lib/kubelet/pods/d775c7bb-fbc5-42d0-9f09-9edfac2d88e3/volumes" Oct 03 18:33:56 crc kubenswrapper[4835]: I1003 18:33:56.968152 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhr2s\" (UniqueName: \"kubernetes.io/projected/5041363c-5aa7-4917-9a7e-0c3fbc478222-kube-api-access-zhr2s\") pod \"nova-cell1-3b08-account-create-ddf9g\" (UID: \"5041363c-5aa7-4917-9a7e-0c3fbc478222\") " pod="openstack/nova-cell1-3b08-account-create-ddf9g" Oct 03 18:33:57 crc kubenswrapper[4835]: I1003 18:33:57.071651 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhr2s\" (UniqueName: \"kubernetes.io/projected/5041363c-5aa7-4917-9a7e-0c3fbc478222-kube-api-access-zhr2s\") pod \"nova-cell1-3b08-account-create-ddf9g\" (UID: \"5041363c-5aa7-4917-9a7e-0c3fbc478222\") " pod="openstack/nova-cell1-3b08-account-create-ddf9g" Oct 03 18:33:57 crc kubenswrapper[4835]: I1003 18:33:57.101893 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhr2s\" (UniqueName: \"kubernetes.io/projected/5041363c-5aa7-4917-9a7e-0c3fbc478222-kube-api-access-zhr2s\") pod \"nova-cell1-3b08-account-create-ddf9g\" (UID: \"5041363c-5aa7-4917-9a7e-0c3fbc478222\") " pod="openstack/nova-cell1-3b08-account-create-ddf9g" Oct 03 18:33:57 crc kubenswrapper[4835]: I1003 18:33:57.291425 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3b08-account-create-ddf9g" Oct 03 18:33:57 crc kubenswrapper[4835]: I1003 18:33:57.435216 4835 generic.go:334] "Generic (PLEG): container finished" podID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerID="d9a2ba0400c7d500bd74227cab1eeff30df2b0f0c231ef47800225e9121aafe2" exitCode=0 Oct 03 18:33:57 crc kubenswrapper[4835]: I1003 18:33:57.435240 4835 generic.go:334] "Generic (PLEG): container finished" podID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerID="9b651731199ee5a8694625c57a3db167b0a0bba1f1ae0e052a758c314e985ac3" exitCode=2 Oct 03 18:33:57 crc kubenswrapper[4835]: I1003 18:33:57.435247 4835 generic.go:334] "Generic (PLEG): container finished" podID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerID="3d4d7e7e929fac6493cadd3d660ec98b3e17330084fa60291fca2c434bd9fdfc" exitCode=0 Oct 03 18:33:57 crc kubenswrapper[4835]: I1003 18:33:57.435262 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095f7b98-b6bf-424a-a33c-d8e601b65aa0","Type":"ContainerDied","Data":"d9a2ba0400c7d500bd74227cab1eeff30df2b0f0c231ef47800225e9121aafe2"} Oct 03 18:33:57 crc kubenswrapper[4835]: I1003 18:33:57.435283 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095f7b98-b6bf-424a-a33c-d8e601b65aa0","Type":"ContainerDied","Data":"9b651731199ee5a8694625c57a3db167b0a0bba1f1ae0e052a758c314e985ac3"} Oct 03 18:33:57 crc kubenswrapper[4835]: I1003 18:33:57.435292 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095f7b98-b6bf-424a-a33c-d8e601b65aa0","Type":"ContainerDied","Data":"3d4d7e7e929fac6493cadd3d660ec98b3e17330084fa60291fca2c434bd9fdfc"} Oct 03 18:33:57 crc kubenswrapper[4835]: W1003 18:33:57.441562 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64736945_0cbb_43dc_9311_4c03123c878b.slice/crio-0258c73bdaf264a6456677c9d1a8d11cfbd3502a2a7aceda47a7e1bbc6bdf1d3 WatchSource:0}: Error finding container 0258c73bdaf264a6456677c9d1a8d11cfbd3502a2a7aceda47a7e1bbc6bdf1d3: Status 404 returned error can't find the container with id 0258c73bdaf264a6456677c9d1a8d11cfbd3502a2a7aceda47a7e1bbc6bdf1d3 Oct 03 18:33:57 crc kubenswrapper[4835]: I1003 18:33:57.443362 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4642-account-create-9fv4f"] Oct 03 18:33:57 crc kubenswrapper[4835]: I1003 18:33:57.508998 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0521-account-create-bb2fh"] Oct 03 18:33:57 crc kubenswrapper[4835]: W1003 18:33:57.513398 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb76843d0_52ea_4ef4_ad05_460c819fed88.slice/crio-33f5b72441fb6fff6ef839fcb7c67fd467cfd880b6c439da918ffee4caa4a818 WatchSource:0}: Error finding container 33f5b72441fb6fff6ef839fcb7c67fd467cfd880b6c439da918ffee4caa4a818: Status 404 returned error can't find the container with id 33f5b72441fb6fff6ef839fcb7c67fd467cfd880b6c439da918ffee4caa4a818 Oct 03 18:33:57 crc kubenswrapper[4835]: I1003 18:33:57.754364 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3b08-account-create-ddf9g"] Oct 03 18:33:57 crc kubenswrapper[4835]: W1003 18:33:57.819637 4835 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5041363c_5aa7_4917_9a7e_0c3fbc478222.slice/crio-fb16a1ab103034a501c910a0ffe17eca349c3118de3afb9b8bc445bfb090bb58 WatchSource:0}: Error finding container fb16a1ab103034a501c910a0ffe17eca349c3118de3afb9b8bc445bfb090bb58: Status 404 returned error can't find the container with id fb16a1ab103034a501c910a0ffe17eca349c3118de3afb9b8bc445bfb090bb58 Oct 03 18:33:58 crc kubenswrapper[4835]: I1003 18:33:58.447019 4835 generic.go:334] "Generic (PLEG): container finished" podID="b76843d0-52ea-4ef4-ad05-460c819fed88" containerID="503b690c67c08bddcd3404403757244f7fe7841067c825256213f9247328855a" exitCode=0 Oct 03 18:33:58 crc kubenswrapper[4835]: I1003 18:33:58.448546 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0521-account-create-bb2fh" event={"ID":"b76843d0-52ea-4ef4-ad05-460c819fed88","Type":"ContainerDied","Data":"503b690c67c08bddcd3404403757244f7fe7841067c825256213f9247328855a"} Oct 03 18:33:58 crc kubenswrapper[4835]: I1003 18:33:58.448659 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0521-account-create-bb2fh" event={"ID":"b76843d0-52ea-4ef4-ad05-460c819fed88","Type":"ContainerStarted","Data":"33f5b72441fb6fff6ef839fcb7c67fd467cfd880b6c439da918ffee4caa4a818"} Oct 03 18:33:58 crc kubenswrapper[4835]: I1003 18:33:58.451064 4835 generic.go:334] "Generic (PLEG): container finished" podID="64736945-0cbb-43dc-9311-4c03123c878b" containerID="97a200106343d17c7b5557a76f0f908bf71800bbbd2581afc6ddfc69e637248a" exitCode=0 Oct 03 18:33:58 crc kubenswrapper[4835]: I1003 18:33:58.451237 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4642-account-create-9fv4f" event={"ID":"64736945-0cbb-43dc-9311-4c03123c878b","Type":"ContainerDied","Data":"97a200106343d17c7b5557a76f0f908bf71800bbbd2581afc6ddfc69e637248a"} Oct 03 18:33:58 crc kubenswrapper[4835]: I1003 18:33:58.451349 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4642-account-create-9fv4f" event={"ID":"64736945-0cbb-43dc-9311-4c03123c878b","Type":"ContainerStarted","Data":"0258c73bdaf264a6456677c9d1a8d11cfbd3502a2a7aceda47a7e1bbc6bdf1d3"} Oct 03 18:33:58 crc kubenswrapper[4835]: I1003 18:33:58.453662 4835 generic.go:334] "Generic (PLEG): container finished" podID="5041363c-5aa7-4917-9a7e-0c3fbc478222" containerID="186d768c80cff48d5e649af26c5f1e1742246a8f2c5cf136125676814e95f320" exitCode=0 Oct 03 18:33:58 crc kubenswrapper[4835]: I1003 18:33:58.453780 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3b08-account-create-ddf9g" event={"ID":"5041363c-5aa7-4917-9a7e-0c3fbc478222","Type":"ContainerDied","Data":"186d768c80cff48d5e649af26c5f1e1742246a8f2c5cf136125676814e95f320"} Oct 03 18:33:58 crc kubenswrapper[4835]: I1003 18:33:58.453862 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3b08-account-create-ddf9g" event={"ID":"5041363c-5aa7-4917-9a7e-0c3fbc478222","Type":"ContainerStarted","Data":"fb16a1ab103034a501c910a0ffe17eca349c3118de3afb9b8bc445bfb090bb58"} Oct 03 18:33:58 crc kubenswrapper[4835]: I1003 18:33:58.882768 4835 scope.go:117] "RemoveContainer" containerID="97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c" Oct 03 18:33:58 crc kubenswrapper[4835]: E1003 18:33:58.883056 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(8110d0e5-9e19-4306-b8aa-babe937e8d2a)\"" pod="openstack/watcher-decision-engine-0" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.041384 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4642-account-create-9fv4f" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.050892 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0521-account-create-bb2fh" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.056155 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3b08-account-create-ddf9g" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.140219 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhr2s\" (UniqueName: \"kubernetes.io/projected/5041363c-5aa7-4917-9a7e-0c3fbc478222-kube-api-access-zhr2s\") pod \"5041363c-5aa7-4917-9a7e-0c3fbc478222\" (UID: \"5041363c-5aa7-4917-9a7e-0c3fbc478222\") " Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.140445 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtsgg\" (UniqueName: \"kubernetes.io/projected/64736945-0cbb-43dc-9311-4c03123c878b-kube-api-access-vtsgg\") pod \"64736945-0cbb-43dc-9311-4c03123c878b\" (UID: \"64736945-0cbb-43dc-9311-4c03123c878b\") " Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.140477 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zmxb\" (UniqueName: \"kubernetes.io/projected/b76843d0-52ea-4ef4-ad05-460c819fed88-kube-api-access-5zmxb\") pod \"b76843d0-52ea-4ef4-ad05-460c819fed88\" (UID: \"b76843d0-52ea-4ef4-ad05-460c819fed88\") " Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.148475 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64736945-0cbb-43dc-9311-4c03123c878b-kube-api-access-vtsgg" (OuterVolumeSpecName: "kube-api-access-vtsgg") pod "64736945-0cbb-43dc-9311-4c03123c878b" (UID: "64736945-0cbb-43dc-9311-4c03123c878b"). InnerVolumeSpecName "kube-api-access-vtsgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.154298 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5041363c-5aa7-4917-9a7e-0c3fbc478222-kube-api-access-zhr2s" (OuterVolumeSpecName: "kube-api-access-zhr2s") pod "5041363c-5aa7-4917-9a7e-0c3fbc478222" (UID: "5041363c-5aa7-4917-9a7e-0c3fbc478222"). InnerVolumeSpecName "kube-api-access-zhr2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.176246 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76843d0-52ea-4ef4-ad05-460c819fed88-kube-api-access-5zmxb" (OuterVolumeSpecName: "kube-api-access-5zmxb") pod "b76843d0-52ea-4ef4-ad05-460c819fed88" (UID: "b76843d0-52ea-4ef4-ad05-460c819fed88"). InnerVolumeSpecName "kube-api-access-5zmxb". 
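
The entry above for openstack/watcher-decision-engine-0 shows the kubelet skipping a sync because the watcher-decision-engine container is in CrashLoopBackOff with a 40s back-off. A rough Python tally of such back-off errors over a saved journal (again assuming one entry per line; the regex is fitted to the message format of these lines, not a general parser):

    import re
    import sys
    from collections import Counter

    # Tailored to the back-off text in the CrashLoopBackOff entry above:
    #   back-off 40s restarting failed container=watcher-decision-engine
    #   pod=watcher-decision-engine-0_openstack(8110d0e5-...)
    BACKOFF = re.compile(
        r'back-off (?P<delay>\S+) restarting failed '
        r'container=(?P<container>\S+) pod=(?P<pod>[^(]+)\('
    )

    hits = Counter()
    for line in sys.stdin:
        m = BACKOFF.search(line)
        if m:
            hits[(m["pod"], m["container"], m["delay"])] += 1

    for (pod, container, delay), n in hits.most_common():
        print(f"{pod}/{container}: back-off {delay}, seen {n} time(s)")
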
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.243106 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhr2s\" (UniqueName: \"kubernetes.io/projected/5041363c-5aa7-4917-9a7e-0c3fbc478222-kube-api-access-zhr2s\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.243315 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtsgg\" (UniqueName: \"kubernetes.io/projected/64736945-0cbb-43dc-9311-4c03123c878b-kube-api-access-vtsgg\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.243373 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zmxb\" (UniqueName: \"kubernetes.io/projected/b76843d0-52ea-4ef4-ad05-460c819fed88-kube-api-access-5zmxb\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.386311 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.476289 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3b08-account-create-ddf9g" event={"ID":"5041363c-5aa7-4917-9a7e-0c3fbc478222","Type":"ContainerDied","Data":"fb16a1ab103034a501c910a0ffe17eca349c3118de3afb9b8bc445bfb090bb58"} Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.476345 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb16a1ab103034a501c910a0ffe17eca349c3118de3afb9b8bc445bfb090bb58" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.476314 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3b08-account-create-ddf9g" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.481691 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0521-account-create-bb2fh" event={"ID":"b76843d0-52ea-4ef4-ad05-460c819fed88","Type":"ContainerDied","Data":"33f5b72441fb6fff6ef839fcb7c67fd467cfd880b6c439da918ffee4caa4a818"} Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.481739 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33f5b72441fb6fff6ef839fcb7c67fd467cfd880b6c439da918ffee4caa4a818" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.481817 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0521-account-create-bb2fh" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.490954 4835 generic.go:334] "Generic (PLEG): container finished" podID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerID="4c938b9751aef4a88f1be6023224d89631d84384843154a46986b2caadbfdfb2" exitCode=0 Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.491053 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.491064 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095f7b98-b6bf-424a-a33c-d8e601b65aa0","Type":"ContainerDied","Data":"4c938b9751aef4a88f1be6023224d89631d84384843154a46986b2caadbfdfb2"} Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.491353 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"095f7b98-b6bf-424a-a33c-d8e601b65aa0","Type":"ContainerDied","Data":"88942db75e9c474fea39d6fb9d1bf3391cc994cd1a0f9227d92f94fe0742d2f8"} Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.491383 4835 scope.go:117] "RemoveContainer" containerID="d9a2ba0400c7d500bd74227cab1eeff30df2b0f0c231ef47800225e9121aafe2" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.494959 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4642-account-create-9fv4f" event={"ID":"64736945-0cbb-43dc-9311-4c03123c878b","Type":"ContainerDied","Data":"0258c73bdaf264a6456677c9d1a8d11cfbd3502a2a7aceda47a7e1bbc6bdf1d3"} Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.494986 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0258c73bdaf264a6456677c9d1a8d11cfbd3502a2a7aceda47a7e1bbc6bdf1d3" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.495052 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4642-account-create-9fv4f" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.535049 4835 scope.go:117] "RemoveContainer" containerID="9b651731199ee5a8694625c57a3db167b0a0bba1f1ae0e052a758c314e985ac3" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.548046 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095f7b98-b6bf-424a-a33c-d8e601b65aa0-log-httpd\") pod \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.548147 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-combined-ca-bundle\") pod \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.548196 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-sg-core-conf-yaml\") pod \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.548220 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095f7b98-b6bf-424a-a33c-d8e601b65aa0-run-httpd\") pod \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.548289 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-scripts\") pod \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.548401 4835 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-config-data\") pod \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.548429 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x9jn\" (UniqueName: \"kubernetes.io/projected/095f7b98-b6bf-424a-a33c-d8e601b65aa0-kube-api-access-7x9jn\") pod \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\" (UID: \"095f7b98-b6bf-424a-a33c-d8e601b65aa0\") " Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.548792 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/095f7b98-b6bf-424a-a33c-d8e601b65aa0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "095f7b98-b6bf-424a-a33c-d8e601b65aa0" (UID: "095f7b98-b6bf-424a-a33c-d8e601b65aa0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.548986 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/095f7b98-b6bf-424a-a33c-d8e601b65aa0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "095f7b98-b6bf-424a-a33c-d8e601b65aa0" (UID: "095f7b98-b6bf-424a-a33c-d8e601b65aa0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.549264 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095f7b98-b6bf-424a-a33c-d8e601b65aa0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.549282 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/095f7b98-b6bf-424a-a33c-d8e601b65aa0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.554329 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/095f7b98-b6bf-424a-a33c-d8e601b65aa0-kube-api-access-7x9jn" (OuterVolumeSpecName: "kube-api-access-7x9jn") pod "095f7b98-b6bf-424a-a33c-d8e601b65aa0" (UID: "095f7b98-b6bf-424a-a33c-d8e601b65aa0"). InnerVolumeSpecName "kube-api-access-7x9jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.554670 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-scripts" (OuterVolumeSpecName: "scripts") pod "095f7b98-b6bf-424a-a33c-d8e601b65aa0" (UID: "095f7b98-b6bf-424a-a33c-d8e601b65aa0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.559485 4835 scope.go:117] "RemoveContainer" containerID="3d4d7e7e929fac6493cadd3d660ec98b3e17330084fa60291fca2c434bd9fdfc" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.582034 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "095f7b98-b6bf-424a-a33c-d8e601b65aa0" (UID: "095f7b98-b6bf-424a-a33c-d8e601b65aa0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.584517 4835 scope.go:117] "RemoveContainer" containerID="4c938b9751aef4a88f1be6023224d89631d84384843154a46986b2caadbfdfb2" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.615961 4835 scope.go:117] "RemoveContainer" containerID="d9a2ba0400c7d500bd74227cab1eeff30df2b0f0c231ef47800225e9121aafe2" Oct 03 18:34:00 crc kubenswrapper[4835]: E1003 18:34:00.616424 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9a2ba0400c7d500bd74227cab1eeff30df2b0f0c231ef47800225e9121aafe2\": container with ID starting with d9a2ba0400c7d500bd74227cab1eeff30df2b0f0c231ef47800225e9121aafe2 not found: ID does not exist" containerID="d9a2ba0400c7d500bd74227cab1eeff30df2b0f0c231ef47800225e9121aafe2" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.616462 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9a2ba0400c7d500bd74227cab1eeff30df2b0f0c231ef47800225e9121aafe2"} err="failed to get container status \"d9a2ba0400c7d500bd74227cab1eeff30df2b0f0c231ef47800225e9121aafe2\": rpc error: code = NotFound desc = could not find container \"d9a2ba0400c7d500bd74227cab1eeff30df2b0f0c231ef47800225e9121aafe2\": container with ID starting with d9a2ba0400c7d500bd74227cab1eeff30df2b0f0c231ef47800225e9121aafe2 not found: ID does not exist" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.616490 4835 scope.go:117] "RemoveContainer" containerID="9b651731199ee5a8694625c57a3db167b0a0bba1f1ae0e052a758c314e985ac3" Oct 03 18:34:00 crc kubenswrapper[4835]: E1003 18:34:00.616803 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b651731199ee5a8694625c57a3db167b0a0bba1f1ae0e052a758c314e985ac3\": container with ID starting with 9b651731199ee5a8694625c57a3db167b0a0bba1f1ae0e052a758c314e985ac3 not found: ID does not exist" containerID="9b651731199ee5a8694625c57a3db167b0a0bba1f1ae0e052a758c314e985ac3" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.616834 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b651731199ee5a8694625c57a3db167b0a0bba1f1ae0e052a758c314e985ac3"} err="failed to get container status \"9b651731199ee5a8694625c57a3db167b0a0bba1f1ae0e052a758c314e985ac3\": rpc error: code = NotFound desc = could not find container \"9b651731199ee5a8694625c57a3db167b0a0bba1f1ae0e052a758c314e985ac3\": container with ID starting with 9b651731199ee5a8694625c57a3db167b0a0bba1f1ae0e052a758c314e985ac3 not found: ID does not exist" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.616857 4835 scope.go:117] "RemoveContainer" containerID="3d4d7e7e929fac6493cadd3d660ec98b3e17330084fa60291fca2c434bd9fdfc" Oct 03 18:34:00 crc kubenswrapper[4835]: E1003 18:34:00.617213 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d4d7e7e929fac6493cadd3d660ec98b3e17330084fa60291fca2c434bd9fdfc\": container with ID starting with 3d4d7e7e929fac6493cadd3d660ec98b3e17330084fa60291fca2c434bd9fdfc not found: ID does not exist" containerID="3d4d7e7e929fac6493cadd3d660ec98b3e17330084fa60291fca2c434bd9fdfc" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.617245 4835 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3d4d7e7e929fac6493cadd3d660ec98b3e17330084fa60291fca2c434bd9fdfc"} err="failed to get container status \"3d4d7e7e929fac6493cadd3d660ec98b3e17330084fa60291fca2c434bd9fdfc\": rpc error: code = NotFound desc = could not find container \"3d4d7e7e929fac6493cadd3d660ec98b3e17330084fa60291fca2c434bd9fdfc\": container with ID starting with 3d4d7e7e929fac6493cadd3d660ec98b3e17330084fa60291fca2c434bd9fdfc not found: ID does not exist" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.617261 4835 scope.go:117] "RemoveContainer" containerID="4c938b9751aef4a88f1be6023224d89631d84384843154a46986b2caadbfdfb2" Oct 03 18:34:00 crc kubenswrapper[4835]: E1003 18:34:00.617564 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c938b9751aef4a88f1be6023224d89631d84384843154a46986b2caadbfdfb2\": container with ID starting with 4c938b9751aef4a88f1be6023224d89631d84384843154a46986b2caadbfdfb2 not found: ID does not exist" containerID="4c938b9751aef4a88f1be6023224d89631d84384843154a46986b2caadbfdfb2" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.617588 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c938b9751aef4a88f1be6023224d89631d84384843154a46986b2caadbfdfb2"} err="failed to get container status \"4c938b9751aef4a88f1be6023224d89631d84384843154a46986b2caadbfdfb2\": rpc error: code = NotFound desc = could not find container \"4c938b9751aef4a88f1be6023224d89631d84384843154a46986b2caadbfdfb2\": container with ID starting with 4c938b9751aef4a88f1be6023224d89631d84384843154a46986b2caadbfdfb2 not found: ID does not exist" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.646295 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-config-data" (OuterVolumeSpecName: "config-data") pod "095f7b98-b6bf-424a-a33c-d8e601b65aa0" (UID: "095f7b98-b6bf-424a-a33c-d8e601b65aa0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.649046 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "095f7b98-b6bf-424a-a33c-d8e601b65aa0" (UID: "095f7b98-b6bf-424a-a33c-d8e601b65aa0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.651184 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.651210 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.651220 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.651228 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/095f7b98-b6bf-424a-a33c-d8e601b65aa0-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.651237 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x9jn\" (UniqueName: \"kubernetes.io/projected/095f7b98-b6bf-424a-a33c-d8e601b65aa0-kube-api-access-7x9jn\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.826193 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.837384 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.849555 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:00 crc kubenswrapper[4835]: E1003 18:34:00.850015 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64736945-0cbb-43dc-9311-4c03123c878b" containerName="mariadb-account-create" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.850036 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="64736945-0cbb-43dc-9311-4c03123c878b" containerName="mariadb-account-create" Oct 03 18:34:00 crc kubenswrapper[4835]: E1003 18:34:00.850326 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerName="ceilometer-notification-agent" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.850349 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerName="ceilometer-notification-agent" Oct 03 18:34:00 crc kubenswrapper[4835]: E1003 18:34:00.850387 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76843d0-52ea-4ef4-ad05-460c819fed88" containerName="mariadb-account-create" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.850398 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76843d0-52ea-4ef4-ad05-460c819fed88" containerName="mariadb-account-create" Oct 03 18:34:00 crc kubenswrapper[4835]: E1003 18:34:00.850419 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerName="proxy-httpd" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.850427 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerName="proxy-httpd" Oct 03 18:34:00 crc kubenswrapper[4835]: E1003 18:34:00.850449 4835 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerName="sg-core" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.850457 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerName="sg-core" Oct 03 18:34:00 crc kubenswrapper[4835]: E1003 18:34:00.850470 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerName="ceilometer-central-agent" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.850478 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerName="ceilometer-central-agent" Oct 03 18:34:00 crc kubenswrapper[4835]: E1003 18:34:00.850505 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5041363c-5aa7-4917-9a7e-0c3fbc478222" containerName="mariadb-account-create" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.850514 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5041363c-5aa7-4917-9a7e-0c3fbc478222" containerName="mariadb-account-create" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.851108 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerName="proxy-httpd" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.851130 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5041363c-5aa7-4917-9a7e-0c3fbc478222" containerName="mariadb-account-create" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.851152 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76843d0-52ea-4ef4-ad05-460c819fed88" containerName="mariadb-account-create" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.851167 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="64736945-0cbb-43dc-9311-4c03123c878b" containerName="mariadb-account-create" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.851180 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerName="ceilometer-notification-agent" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.851198 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerName="sg-core" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.851212 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" containerName="ceilometer-central-agent" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.856589 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.860096 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.860462 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.860716 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.904350 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="095f7b98-b6bf-424a-a33c-d8e601b65aa0" path="/var/lib/kubelet/pods/095f7b98-b6bf-424a-a33c-d8e601b65aa0/volumes" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.957154 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-config-data\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.957255 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.957355 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23b69c74-f914-4313-8366-e950afef5d8b-log-httpd\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.957401 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.957435 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-scripts\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.957477 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23b69c74-f914-4313-8366-e950afef5d8b-run-httpd\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:00 crc kubenswrapper[4835]: I1003 18:34:00.957501 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhj5l\" (UniqueName: \"kubernetes.io/projected/23b69c74-f914-4313-8366-e950afef5d8b-kube-api-access-lhj5l\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.059105 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.059200 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23b69c74-f914-4313-8366-e950afef5d8b-log-httpd\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.059252 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.059283 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-scripts\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.059346 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23b69c74-f914-4313-8366-e950afef5d8b-run-httpd\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.059554 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhj5l\" (UniqueName: \"kubernetes.io/projected/23b69c74-f914-4313-8366-e950afef5d8b-kube-api-access-lhj5l\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.059637 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-config-data\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.059689 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23b69c74-f914-4313-8366-e950afef5d8b-log-httpd\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.059702 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23b69c74-f914-4313-8366-e950afef5d8b-run-httpd\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.063624 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-config-data\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.064396 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-scripts\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.066232 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.066708 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.080731 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhj5l\" (UniqueName: \"kubernetes.io/projected/23b69c74-f914-4313-8366-e950afef5d8b-kube-api-access-lhj5l\") pod \"ceilometer-0\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " pod="openstack/ceilometer-0" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.220938 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.651370 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.839716 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c9pn5"] Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.841529 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c9pn5" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.855292 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.855494 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qmmml" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.855675 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.862458 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c9pn5"] Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.978815 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mwrb\" (UniqueName: \"kubernetes.io/projected/9703db0d-8373-4427-98a0-c36d069f7c71-kube-api-access-4mwrb\") pod \"nova-cell0-conductor-db-sync-c9pn5\" (UID: \"9703db0d-8373-4427-98a0-c36d069f7c71\") " pod="openstack/nova-cell0-conductor-db-sync-c9pn5" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.978948 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c9pn5\" (UID: \"9703db0d-8373-4427-98a0-c36d069f7c71\") " pod="openstack/nova-cell0-conductor-db-sync-c9pn5" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.978975 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-config-data\") pod \"nova-cell0-conductor-db-sync-c9pn5\" (UID: \"9703db0d-8373-4427-98a0-c36d069f7c71\") " pod="openstack/nova-cell0-conductor-db-sync-c9pn5" Oct 03 18:34:01 crc kubenswrapper[4835]: I1003 18:34:01.979056 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-scripts\") pod \"nova-cell0-conductor-db-sync-c9pn5\" (UID: \"9703db0d-8373-4427-98a0-c36d069f7c71\") " pod="openstack/nova-cell0-conductor-db-sync-c9pn5" Oct 03 18:34:02 crc kubenswrapper[4835]: I1003 18:34:02.080390 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c9pn5\" (UID: \"9703db0d-8373-4427-98a0-c36d069f7c71\") " pod="openstack/nova-cell0-conductor-db-sync-c9pn5" Oct 03 18:34:02 crc kubenswrapper[4835]: I1003 18:34:02.080459 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-config-data\") pod \"nova-cell0-conductor-db-sync-c9pn5\" (UID: \"9703db0d-8373-4427-98a0-c36d069f7c71\") " pod="openstack/nova-cell0-conductor-db-sync-c9pn5" Oct 03 18:34:02 crc kubenswrapper[4835]: I1003 18:34:02.080521 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-scripts\") pod \"nova-cell0-conductor-db-sync-c9pn5\" (UID: 
\"9703db0d-8373-4427-98a0-c36d069f7c71\") " pod="openstack/nova-cell0-conductor-db-sync-c9pn5" Oct 03 18:34:02 crc kubenswrapper[4835]: I1003 18:34:02.080635 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mwrb\" (UniqueName: \"kubernetes.io/projected/9703db0d-8373-4427-98a0-c36d069f7c71-kube-api-access-4mwrb\") pod \"nova-cell0-conductor-db-sync-c9pn5\" (UID: \"9703db0d-8373-4427-98a0-c36d069f7c71\") " pod="openstack/nova-cell0-conductor-db-sync-c9pn5" Oct 03 18:34:02 crc kubenswrapper[4835]: I1003 18:34:02.084725 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-c9pn5\" (UID: \"9703db0d-8373-4427-98a0-c36d069f7c71\") " pod="openstack/nova-cell0-conductor-db-sync-c9pn5" Oct 03 18:34:02 crc kubenswrapper[4835]: I1003 18:34:02.085139 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-scripts\") pod \"nova-cell0-conductor-db-sync-c9pn5\" (UID: \"9703db0d-8373-4427-98a0-c36d069f7c71\") " pod="openstack/nova-cell0-conductor-db-sync-c9pn5" Oct 03 18:34:02 crc kubenswrapper[4835]: I1003 18:34:02.086426 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-config-data\") pod \"nova-cell0-conductor-db-sync-c9pn5\" (UID: \"9703db0d-8373-4427-98a0-c36d069f7c71\") " pod="openstack/nova-cell0-conductor-db-sync-c9pn5" Oct 03 18:34:02 crc kubenswrapper[4835]: I1003 18:34:02.096684 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mwrb\" (UniqueName: \"kubernetes.io/projected/9703db0d-8373-4427-98a0-c36d069f7c71-kube-api-access-4mwrb\") pod \"nova-cell0-conductor-db-sync-c9pn5\" (UID: \"9703db0d-8373-4427-98a0-c36d069f7c71\") " pod="openstack/nova-cell0-conductor-db-sync-c9pn5" Oct 03 18:34:02 crc kubenswrapper[4835]: I1003 18:34:02.200047 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c9pn5" Oct 03 18:34:02 crc kubenswrapper[4835]: I1003 18:34:02.515305 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23b69c74-f914-4313-8366-e950afef5d8b","Type":"ContainerStarted","Data":"c1be237630f2669cf9ad6d6f81de8a3588d589b3c8e7ad433f75db04e419a32d"} Oct 03 18:34:02 crc kubenswrapper[4835]: I1003 18:34:02.515625 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23b69c74-f914-4313-8366-e950afef5d8b","Type":"ContainerStarted","Data":"d5b50f272144f10c18b5e379d94f5e78bb35e11ab6fc8762bba9d2f2e83b2955"} Oct 03 18:34:02 crc kubenswrapper[4835]: I1003 18:34:02.515635 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23b69c74-f914-4313-8366-e950afef5d8b","Type":"ContainerStarted","Data":"f4abb4a6c0f2c02d8f5ce7fd1be940f780e1594178bff2e12e2f58f63cf6957d"} Oct 03 18:34:02 crc kubenswrapper[4835]: I1003 18:34:02.649278 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c9pn5"] Oct 03 18:34:02 crc kubenswrapper[4835]: W1003 18:34:02.650117 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9703db0d_8373_4427_98a0_c36d069f7c71.slice/crio-757119ed03f048d164cf308c69c73f080270273e915926828c826256e5593861 WatchSource:0}: Error finding container 757119ed03f048d164cf308c69c73f080270273e915926828c826256e5593861: Status 404 returned error can't find the container with id 757119ed03f048d164cf308c69c73f080270273e915926828c826256e5593861 Oct 03 18:34:02 crc kubenswrapper[4835]: I1003 18:34:02.761541 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:03 crc kubenswrapper[4835]: I1003 18:34:03.527568 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23b69c74-f914-4313-8366-e950afef5d8b","Type":"ContainerStarted","Data":"64be2f3926a109e141cf1168de1a31325f3de4760562c702dc7e674007b7f42c"} Oct 03 18:34:03 crc kubenswrapper[4835]: I1003 18:34:03.529199 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c9pn5" event={"ID":"9703db0d-8373-4427-98a0-c36d069f7c71","Type":"ContainerStarted","Data":"757119ed03f048d164cf308c69c73f080270273e915926828c826256e5593861"} Oct 03 18:34:04 crc kubenswrapper[4835]: I1003 18:34:04.547960 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23b69c74-f914-4313-8366-e950afef5d8b","Type":"ContainerStarted","Data":"e470513d82b5ed761fb9777d2b36e373b91e0fe6381a794600096870fa5cca44"} Oct 03 18:34:04 crc kubenswrapper[4835]: I1003 18:34:04.548098 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23b69c74-f914-4313-8366-e950afef5d8b" containerName="ceilometer-central-agent" containerID="cri-o://d5b50f272144f10c18b5e379d94f5e78bb35e11ab6fc8762bba9d2f2e83b2955" gracePeriod=30 Oct 03 18:34:04 crc kubenswrapper[4835]: I1003 18:34:04.548154 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23b69c74-f914-4313-8366-e950afef5d8b" containerName="sg-core" containerID="cri-o://64be2f3926a109e141cf1168de1a31325f3de4760562c702dc7e674007b7f42c" gracePeriod=30 Oct 03 18:34:04 crc kubenswrapper[4835]: I1003 18:34:04.548203 4835 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="23b69c74-f914-4313-8366-e950afef5d8b" containerName="proxy-httpd" containerID="cri-o://e470513d82b5ed761fb9777d2b36e373b91e0fe6381a794600096870fa5cca44" gracePeriod=30 Oct 03 18:34:04 crc kubenswrapper[4835]: I1003 18:34:04.548227 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="23b69c74-f914-4313-8366-e950afef5d8b" containerName="ceilometer-notification-agent" containerID="cri-o://c1be237630f2669cf9ad6d6f81de8a3588d589b3c8e7ad433f75db04e419a32d" gracePeriod=30 Oct 03 18:34:04 crc kubenswrapper[4835]: I1003 18:34:04.548425 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 18:34:04 crc kubenswrapper[4835]: I1003 18:34:04.573739 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.040487204 podStartE2EDuration="4.573716445s" podCreationTimestamp="2025-10-03 18:34:00 +0000 UTC" firstStartedPulling="2025-10-03 18:34:01.655432797 +0000 UTC m=+1183.371373669" lastFinishedPulling="2025-10-03 18:34:04.188662038 +0000 UTC m=+1185.904602910" observedRunningTime="2025-10-03 18:34:04.570964368 +0000 UTC m=+1186.286905240" watchObservedRunningTime="2025-10-03 18:34:04.573716445 +0000 UTC m=+1186.289657327" Oct 03 18:34:05 crc kubenswrapper[4835]: I1003 18:34:05.564234 4835 generic.go:334] "Generic (PLEG): container finished" podID="23b69c74-f914-4313-8366-e950afef5d8b" containerID="e470513d82b5ed761fb9777d2b36e373b91e0fe6381a794600096870fa5cca44" exitCode=0 Oct 03 18:34:05 crc kubenswrapper[4835]: I1003 18:34:05.564554 4835 generic.go:334] "Generic (PLEG): container finished" podID="23b69c74-f914-4313-8366-e950afef5d8b" containerID="64be2f3926a109e141cf1168de1a31325f3de4760562c702dc7e674007b7f42c" exitCode=2 Oct 03 18:34:05 crc kubenswrapper[4835]: I1003 18:34:05.564768 4835 generic.go:334] "Generic (PLEG): container finished" podID="23b69c74-f914-4313-8366-e950afef5d8b" containerID="c1be237630f2669cf9ad6d6f81de8a3588d589b3c8e7ad433f75db04e419a32d" exitCode=0 Oct 03 18:34:05 crc kubenswrapper[4835]: I1003 18:34:05.564790 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23b69c74-f914-4313-8366-e950afef5d8b","Type":"ContainerDied","Data":"e470513d82b5ed761fb9777d2b36e373b91e0fe6381a794600096870fa5cca44"} Oct 03 18:34:05 crc kubenswrapper[4835]: I1003 18:34:05.564818 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23b69c74-f914-4313-8366-e950afef5d8b","Type":"ContainerDied","Data":"64be2f3926a109e141cf1168de1a31325f3de4760562c702dc7e674007b7f42c"} Oct 03 18:34:05 crc kubenswrapper[4835]: I1003 18:34:05.564861 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23b69c74-f914-4313-8366-e950afef5d8b","Type":"ContainerDied","Data":"c1be237630f2669cf9ad6d6f81de8a3588d589b3c8e7ad433f75db04e419a32d"} Oct 03 18:34:08 crc kubenswrapper[4835]: I1003 18:34:08.166194 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 18:34:08 crc kubenswrapper[4835]: I1003 18:34:08.167147 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 03 18:34:08 crc kubenswrapper[4835]: I1003 18:34:08.167279 4835 scope.go:117] "RemoveContainer" 
containerID="97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c" Oct 03 18:34:08 crc kubenswrapper[4835]: E1003 18:34:08.167508 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(8110d0e5-9e19-4306-b8aa-babe937e8d2a)\"" pod="openstack/watcher-decision-engine-0" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" Oct 03 18:34:08 crc kubenswrapper[4835]: I1003 18:34:08.592143 4835 scope.go:117] "RemoveContainer" containerID="97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c" Oct 03 18:34:08 crc kubenswrapper[4835]: E1003 18:34:08.592358 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(8110d0e5-9e19-4306-b8aa-babe937e8d2a)\"" pod="openstack/watcher-decision-engine-0" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" Oct 03 18:34:10 crc kubenswrapper[4835]: I1003 18:34:10.619205 4835 generic.go:334] "Generic (PLEG): container finished" podID="23b69c74-f914-4313-8366-e950afef5d8b" containerID="d5b50f272144f10c18b5e379d94f5e78bb35e11ab6fc8762bba9d2f2e83b2955" exitCode=0 Oct 03 18:34:10 crc kubenswrapper[4835]: I1003 18:34:10.619283 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23b69c74-f914-4313-8366-e950afef5d8b","Type":"ContainerDied","Data":"d5b50f272144f10c18b5e379d94f5e78bb35e11ab6fc8762bba9d2f2e83b2955"} Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.700595 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.777600 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-sg-core-conf-yaml\") pod \"23b69c74-f914-4313-8366-e950afef5d8b\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.777655 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-scripts\") pod \"23b69c74-f914-4313-8366-e950afef5d8b\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.777695 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhj5l\" (UniqueName: \"kubernetes.io/projected/23b69c74-f914-4313-8366-e950afef5d8b-kube-api-access-lhj5l\") pod \"23b69c74-f914-4313-8366-e950afef5d8b\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.777730 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-config-data\") pod \"23b69c74-f914-4313-8366-e950afef5d8b\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.777762 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23b69c74-f914-4313-8366-e950afef5d8b-run-httpd\") pod \"23b69c74-f914-4313-8366-e950afef5d8b\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.777819 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23b69c74-f914-4313-8366-e950afef5d8b-log-httpd\") pod \"23b69c74-f914-4313-8366-e950afef5d8b\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.777890 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-combined-ca-bundle\") pod \"23b69c74-f914-4313-8366-e950afef5d8b\" (UID: \"23b69c74-f914-4313-8366-e950afef5d8b\") " Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.778477 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23b69c74-f914-4313-8366-e950afef5d8b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "23b69c74-f914-4313-8366-e950afef5d8b" (UID: "23b69c74-f914-4313-8366-e950afef5d8b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.778656 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23b69c74-f914-4313-8366-e950afef5d8b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "23b69c74-f914-4313-8366-e950afef5d8b" (UID: "23b69c74-f914-4313-8366-e950afef5d8b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.785221 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-scripts" (OuterVolumeSpecName: "scripts") pod "23b69c74-f914-4313-8366-e950afef5d8b" (UID: "23b69c74-f914-4313-8366-e950afef5d8b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.785338 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b69c74-f914-4313-8366-e950afef5d8b-kube-api-access-lhj5l" (OuterVolumeSpecName: "kube-api-access-lhj5l") pod "23b69c74-f914-4313-8366-e950afef5d8b" (UID: "23b69c74-f914-4313-8366-e950afef5d8b"). InnerVolumeSpecName "kube-api-access-lhj5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.807350 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "23b69c74-f914-4313-8366-e950afef5d8b" (UID: "23b69c74-f914-4313-8366-e950afef5d8b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.848321 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23b69c74-f914-4313-8366-e950afef5d8b" (UID: "23b69c74-f914-4313-8366-e950afef5d8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.880484 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.880525 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.880538 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.880549 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhj5l\" (UniqueName: \"kubernetes.io/projected/23b69c74-f914-4313-8366-e950afef5d8b-kube-api-access-lhj5l\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.880565 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23b69c74-f914-4313-8366-e950afef5d8b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.880574 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/23b69c74-f914-4313-8366-e950afef5d8b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.890245 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-config-data" (OuterVolumeSpecName: "config-data") pod "23b69c74-f914-4313-8366-e950afef5d8b" (UID: "23b69c74-f914-4313-8366-e950afef5d8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:11 crc kubenswrapper[4835]: I1003 18:34:11.982289 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23b69c74-f914-4313-8366-e950afef5d8b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.639021 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c9pn5" event={"ID":"9703db0d-8373-4427-98a0-c36d069f7c71","Type":"ContainerStarted","Data":"3d4b88308c914465b10d8db996b8bb3ca48477dff927d18fdf94c1236e2a76dd"} Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.642197 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"23b69c74-f914-4313-8366-e950afef5d8b","Type":"ContainerDied","Data":"f4abb4a6c0f2c02d8f5ce7fd1be940f780e1594178bff2e12e2f58f63cf6957d"} Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.642245 4835 scope.go:117] "RemoveContainer" containerID="e470513d82b5ed761fb9777d2b36e373b91e0fe6381a794600096870fa5cca44" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.642299 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.664210 4835 scope.go:117] "RemoveContainer" containerID="64be2f3926a109e141cf1168de1a31325f3de4760562c702dc7e674007b7f42c" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.666381 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-c9pn5" podStartSLOduration=2.90261997 podStartE2EDuration="11.666360399s" podCreationTimestamp="2025-10-03 18:34:01 +0000 UTC" firstStartedPulling="2025-10-03 18:34:02.652546982 +0000 UTC m=+1184.368487854" lastFinishedPulling="2025-10-03 18:34:11.416287411 +0000 UTC m=+1193.132228283" observedRunningTime="2025-10-03 18:34:12.659194165 +0000 UTC m=+1194.375135037" watchObservedRunningTime="2025-10-03 18:34:12.666360399 +0000 UTC m=+1194.382301271" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.688660 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.689694 4835 scope.go:117] "RemoveContainer" containerID="c1be237630f2669cf9ad6d6f81de8a3588d589b3c8e7ad433f75db04e419a32d" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.697733 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.722020 4835 scope.go:117] "RemoveContainer" containerID="d5b50f272144f10c18b5e379d94f5e78bb35e11ab6fc8762bba9d2f2e83b2955" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.745841 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:12 crc kubenswrapper[4835]: E1003 18:34:12.746898 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b69c74-f914-4313-8366-e950afef5d8b" containerName="ceilometer-notification-agent" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.747004 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b69c74-f914-4313-8366-e950afef5d8b" 
containerName="ceilometer-notification-agent" Oct 03 18:34:12 crc kubenswrapper[4835]: E1003 18:34:12.747113 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b69c74-f914-4313-8366-e950afef5d8b" containerName="sg-core" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.747171 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b69c74-f914-4313-8366-e950afef5d8b" containerName="sg-core" Oct 03 18:34:12 crc kubenswrapper[4835]: E1003 18:34:12.747252 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b69c74-f914-4313-8366-e950afef5d8b" containerName="proxy-httpd" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.747381 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b69c74-f914-4313-8366-e950afef5d8b" containerName="proxy-httpd" Oct 03 18:34:12 crc kubenswrapper[4835]: E1003 18:34:12.747454 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b69c74-f914-4313-8366-e950afef5d8b" containerName="ceilometer-central-agent" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.747510 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b69c74-f914-4313-8366-e950afef5d8b" containerName="ceilometer-central-agent" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.748018 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b69c74-f914-4313-8366-e950afef5d8b" containerName="sg-core" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.748114 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b69c74-f914-4313-8366-e950afef5d8b" containerName="proxy-httpd" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.748263 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b69c74-f914-4313-8366-e950afef5d8b" containerName="ceilometer-central-agent" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.748403 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b69c74-f914-4313-8366-e950afef5d8b" containerName="ceilometer-notification-agent" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.755721 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.758347 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.759834 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.771713 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.795102 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r49m\" (UniqueName: \"kubernetes.io/projected/c36b0768-550f-4685-a011-cedfeaa5e318-kube-api-access-6r49m\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.795225 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.795313 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.795392 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c36b0768-550f-4685-a011-cedfeaa5e318-log-httpd\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.795431 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c36b0768-550f-4685-a011-cedfeaa5e318-run-httpd\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.795519 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-scripts\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.795675 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-config-data\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.887866 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b69c74-f914-4313-8366-e950afef5d8b" path="/var/lib/kubelet/pods/23b69c74-f914-4313-8366-e950afef5d8b/volumes" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.896744 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.896815 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c36b0768-550f-4685-a011-cedfeaa5e318-log-httpd\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.896843 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c36b0768-550f-4685-a011-cedfeaa5e318-run-httpd\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.896928 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-scripts\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.896974 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-config-data\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.897384 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c36b0768-550f-4685-a011-cedfeaa5e318-log-httpd\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.897715 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r49m\" (UniqueName: \"kubernetes.io/projected/c36b0768-550f-4685-a011-cedfeaa5e318-kube-api-access-6r49m\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.897815 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.897987 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c36b0768-550f-4685-a011-cedfeaa5e318-run-httpd\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.909426 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-scripts\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.909442 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.909709 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-config-data\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.909725 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:12 crc kubenswrapper[4835]: I1003 18:34:12.921244 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r49m\" (UniqueName: \"kubernetes.io/projected/c36b0768-550f-4685-a011-cedfeaa5e318-kube-api-access-6r49m\") pod \"ceilometer-0\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " pod="openstack/ceilometer-0" Oct 03 18:34:13 crc kubenswrapper[4835]: I1003 18:34:13.081665 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:34:13 crc kubenswrapper[4835]: I1003 18:34:13.523975 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:13 crc kubenswrapper[4835]: I1003 18:34:13.651610 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c36b0768-550f-4685-a011-cedfeaa5e318","Type":"ContainerStarted","Data":"4051baf5c2b8fae9d75cfc65db760a34da9d2cda77416321348cad0957bf7173"} Oct 03 18:34:14 crc kubenswrapper[4835]: I1003 18:34:14.664429 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c36b0768-550f-4685-a011-cedfeaa5e318","Type":"ContainerStarted","Data":"22c9abbf34f1e5b4fa9a6759e5cab36344412bc2fc35a4249b8fecb344a79143"} Oct 03 18:34:14 crc kubenswrapper[4835]: I1003 18:34:14.664741 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c36b0768-550f-4685-a011-cedfeaa5e318","Type":"ContainerStarted","Data":"80ab64113f8364c8841479a5505907262a22e6387e15c7df9907d8eed659ccbe"} Oct 03 18:34:15 crc kubenswrapper[4835]: I1003 18:34:15.674971 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c36b0768-550f-4685-a011-cedfeaa5e318","Type":"ContainerStarted","Data":"f1240b927d8e6c2f88c552f2985b74c410dd6c2176e311d2e798dcb2602ef4d3"} Oct 03 18:34:16 crc kubenswrapper[4835]: I1003 18:34:16.685867 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c36b0768-550f-4685-a011-cedfeaa5e318","Type":"ContainerStarted","Data":"d08afdd27820a50f188d68b3bd6c23fc1f3190ddced4856c4515b691dbf35b44"} Oct 03 18:34:16 crc kubenswrapper[4835]: I1003 18:34:16.686389 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 18:34:16 crc kubenswrapper[4835]: I1003 18:34:16.703708 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.311738215 podStartE2EDuration="4.703685699s" podCreationTimestamp="2025-10-03 18:34:12 +0000 UTC" firstStartedPulling="2025-10-03 18:34:13.533464763 +0000 UTC 
m=+1195.249405625" lastFinishedPulling="2025-10-03 18:34:15.925412237 +0000 UTC m=+1197.641353109" observedRunningTime="2025-10-03 18:34:16.701433683 +0000 UTC m=+1198.417374575" watchObservedRunningTime="2025-10-03 18:34:16.703685699 +0000 UTC m=+1198.419626571" Oct 03 18:34:19 crc kubenswrapper[4835]: I1003 18:34:19.809492 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 18:34:19 crc kubenswrapper[4835]: I1003 18:34:19.810098 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9e190761-0aec-4401-a034-6d490b395fff" containerName="glance-log" containerID="cri-o://1dc43fdb4cd3d3dec291423d45d74832fe7d50111ba1ce4278899ee31c21ba9e" gracePeriod=30 Oct 03 18:34:19 crc kubenswrapper[4835]: I1003 18:34:19.810198 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9e190761-0aec-4401-a034-6d490b395fff" containerName="glance-httpd" containerID="cri-o://cb752a1534060eb91749498398853c16725bd84e16743d4c1fb0ebf537841452" gracePeriod=30 Oct 03 18:34:20 crc kubenswrapper[4835]: I1003 18:34:20.744104 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9e190761-0aec-4401-a034-6d490b395fff","Type":"ContainerDied","Data":"cb752a1534060eb91749498398853c16725bd84e16743d4c1fb0ebf537841452"} Oct 03 18:34:20 crc kubenswrapper[4835]: I1003 18:34:20.744619 4835 generic.go:334] "Generic (PLEG): container finished" podID="9e190761-0aec-4401-a034-6d490b395fff" containerID="cb752a1534060eb91749498398853c16725bd84e16743d4c1fb0ebf537841452" exitCode=0 Oct 03 18:34:20 crc kubenswrapper[4835]: I1003 18:34:20.744751 4835 generic.go:334] "Generic (PLEG): container finished" podID="9e190761-0aec-4401-a034-6d490b395fff" containerID="1dc43fdb4cd3d3dec291423d45d74832fe7d50111ba1ce4278899ee31c21ba9e" exitCode=143 Oct 03 18:34:20 crc kubenswrapper[4835]: I1003 18:34:20.744775 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9e190761-0aec-4401-a034-6d490b395fff","Type":"ContainerDied","Data":"1dc43fdb4cd3d3dec291423d45d74832fe7d50111ba1ce4278899ee31c21ba9e"} Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.402203 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.473491 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e190761-0aec-4401-a034-6d490b395fff-logs\") pod \"9e190761-0aec-4401-a034-6d490b395fff\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.473636 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-public-tls-certs\") pod \"9e190761-0aec-4401-a034-6d490b395fff\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.473685 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e190761-0aec-4401-a034-6d490b395fff-httpd-run\") pod \"9e190761-0aec-4401-a034-6d490b395fff\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.473750 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvhwr\" (UniqueName: \"kubernetes.io/projected/9e190761-0aec-4401-a034-6d490b395fff-kube-api-access-mvhwr\") pod \"9e190761-0aec-4401-a034-6d490b395fff\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.474242 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e190761-0aec-4401-a034-6d490b395fff-logs" (OuterVolumeSpecName: "logs") pod "9e190761-0aec-4401-a034-6d490b395fff" (UID: "9e190761-0aec-4401-a034-6d490b395fff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.474274 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e190761-0aec-4401-a034-6d490b395fff-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9e190761-0aec-4401-a034-6d490b395fff" (UID: "9e190761-0aec-4401-a034-6d490b395fff"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.474587 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-config-data\") pod \"9e190761-0aec-4401-a034-6d490b395fff\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.474618 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-combined-ca-bundle\") pod \"9e190761-0aec-4401-a034-6d490b395fff\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.474647 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-scripts\") pod \"9e190761-0aec-4401-a034-6d490b395fff\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.474699 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"9e190761-0aec-4401-a034-6d490b395fff\" (UID: \"9e190761-0aec-4401-a034-6d490b395fff\") " Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.475187 4835 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e190761-0aec-4401-a034-6d490b395fff-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.475198 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e190761-0aec-4401-a034-6d490b395fff-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.480574 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "9e190761-0aec-4401-a034-6d490b395fff" (UID: "9e190761-0aec-4401-a034-6d490b395fff"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.480602 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e190761-0aec-4401-a034-6d490b395fff-kube-api-access-mvhwr" (OuterVolumeSpecName: "kube-api-access-mvhwr") pod "9e190761-0aec-4401-a034-6d490b395fff" (UID: "9e190761-0aec-4401-a034-6d490b395fff"). InnerVolumeSpecName "kube-api-access-mvhwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.505225 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-scripts" (OuterVolumeSpecName: "scripts") pod "9e190761-0aec-4401-a034-6d490b395fff" (UID: "9e190761-0aec-4401-a034-6d490b395fff"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.512273 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e190761-0aec-4401-a034-6d490b395fff" (UID: "9e190761-0aec-4401-a034-6d490b395fff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.539256 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9e190761-0aec-4401-a034-6d490b395fff" (UID: "9e190761-0aec-4401-a034-6d490b395fff"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.543859 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-config-data" (OuterVolumeSpecName: "config-data") pod "9e190761-0aec-4401-a034-6d490b395fff" (UID: "9e190761-0aec-4401-a034-6d490b395fff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.576854 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.576887 4835 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.576899 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvhwr\" (UniqueName: \"kubernetes.io/projected/9e190761-0aec-4401-a034-6d490b395fff-kube-api-access-mvhwr\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.576908 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.576919 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.576928 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e190761-0aec-4401-a034-6d490b395fff-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.598867 4835 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.678797 4835 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.756319 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"9e190761-0aec-4401-a034-6d490b395fff","Type":"ContainerDied","Data":"08d1dd8665d2426c908543c92fe39adc5d69189fa8a83db883ea56c172633f4c"} Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.756401 4835 scope.go:117] "RemoveContainer" containerID="cb752a1534060eb91749498398853c16725bd84e16743d4c1fb0ebf537841452" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.756450 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.789833 4835 scope.go:117] "RemoveContainer" containerID="1dc43fdb4cd3d3dec291423d45d74832fe7d50111ba1ce4278899ee31c21ba9e" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.797428 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.821218 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.837164 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.837420 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="04606cff-96b7-4cec-a55c-806267b559dc" containerName="glance-log" containerID="cri-o://0010c15727a0daf6e38ac8b5d622875a87191a587dfa9535b109c0c7f42377d8" gracePeriod=30 Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.837835 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="04606cff-96b7-4cec-a55c-806267b559dc" containerName="glance-httpd" containerID="cri-o://f7ebab970bb4e77e30fcf09a64074147373976e59473678be8aaee33eb783b59" gracePeriod=30 Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.851393 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 18:34:21 crc kubenswrapper[4835]: E1003 18:34:21.851922 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e190761-0aec-4401-a034-6d490b395fff" containerName="glance-httpd" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.851937 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e190761-0aec-4401-a034-6d490b395fff" containerName="glance-httpd" Oct 03 18:34:21 crc kubenswrapper[4835]: E1003 18:34:21.851991 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e190761-0aec-4401-a034-6d490b395fff" containerName="glance-log" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.851998 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e190761-0aec-4401-a034-6d490b395fff" containerName="glance-log" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.852234 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e190761-0aec-4401-a034-6d490b395fff" containerName="glance-httpd" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.852268 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e190761-0aec-4401-a034-6d490b395fff" containerName="glance-log" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.853548 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.856385 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.858213 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.864590 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.985224 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.985286 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6243f0de-fd35-43f6-8eaa-63836b03e125-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.985416 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6243f0de-fd35-43f6-8eaa-63836b03e125-config-data\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.985466 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6243f0de-fd35-43f6-8eaa-63836b03e125-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.985482 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6243f0de-fd35-43f6-8eaa-63836b03e125-logs\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.985497 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw7jv\" (UniqueName: \"kubernetes.io/projected/6243f0de-fd35-43f6-8eaa-63836b03e125-kube-api-access-tw7jv\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.985517 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6243f0de-fd35-43f6-8eaa-63836b03e125-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:21 crc kubenswrapper[4835]: I1003 18:34:21.985554 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6243f0de-fd35-43f6-8eaa-63836b03e125-scripts\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.087174 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6243f0de-fd35-43f6-8eaa-63836b03e125-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.087287 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6243f0de-fd35-43f6-8eaa-63836b03e125-config-data\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.087334 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6243f0de-fd35-43f6-8eaa-63836b03e125-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.087351 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6243f0de-fd35-43f6-8eaa-63836b03e125-logs\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.087370 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw7jv\" (UniqueName: \"kubernetes.io/projected/6243f0de-fd35-43f6-8eaa-63836b03e125-kube-api-access-tw7jv\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.087387 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6243f0de-fd35-43f6-8eaa-63836b03e125-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.087410 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6243f0de-fd35-43f6-8eaa-63836b03e125-scripts\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.087449 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.087643 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.089665 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6243f0de-fd35-43f6-8eaa-63836b03e125-logs\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.089706 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6243f0de-fd35-43f6-8eaa-63836b03e125-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.092774 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6243f0de-fd35-43f6-8eaa-63836b03e125-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.094140 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6243f0de-fd35-43f6-8eaa-63836b03e125-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.101749 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6243f0de-fd35-43f6-8eaa-63836b03e125-scripts\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.107417 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6243f0de-fd35-43f6-8eaa-63836b03e125-config-data\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.110534 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw7jv\" (UniqueName: \"kubernetes.io/projected/6243f0de-fd35-43f6-8eaa-63836b03e125-kube-api-access-tw7jv\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.133707 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"6243f0de-fd35-43f6-8eaa-63836b03e125\") " pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.188935 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.770232 4835 generic.go:334] "Generic (PLEG): container finished" podID="04606cff-96b7-4cec-a55c-806267b559dc" containerID="0010c15727a0daf6e38ac8b5d622875a87191a587dfa9535b109c0c7f42377d8" exitCode=143 Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.770351 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"04606cff-96b7-4cec-a55c-806267b559dc","Type":"ContainerDied","Data":"0010c15727a0daf6e38ac8b5d622875a87191a587dfa9535b109c0c7f42377d8"} Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.857695 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.876625 4835 scope.go:117] "RemoveContainer" containerID="97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c" Oct 03 18:34:22 crc kubenswrapper[4835]: E1003 18:34:22.876838 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(8110d0e5-9e19-4306-b8aa-babe937e8d2a)\"" pod="openstack/watcher-decision-engine-0" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" Oct 03 18:34:22 crc kubenswrapper[4835]: I1003 18:34:22.887882 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e190761-0aec-4401-a034-6d490b395fff" path="/var/lib/kubelet/pods/9e190761-0aec-4401-a034-6d490b395fff/volumes" Oct 03 18:34:23 crc kubenswrapper[4835]: I1003 18:34:23.062203 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:23 crc kubenswrapper[4835]: I1003 18:34:23.062495 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" containerName="ceilometer-central-agent" containerID="cri-o://80ab64113f8364c8841479a5505907262a22e6387e15c7df9907d8eed659ccbe" gracePeriod=30 Oct 03 18:34:23 crc kubenswrapper[4835]: I1003 18:34:23.062564 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" containerName="sg-core" containerID="cri-o://f1240b927d8e6c2f88c552f2985b74c410dd6c2176e311d2e798dcb2602ef4d3" gracePeriod=30 Oct 03 18:34:23 crc kubenswrapper[4835]: I1003 18:34:23.062693 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" containerName="ceilometer-notification-agent" containerID="cri-o://22c9abbf34f1e5b4fa9a6759e5cab36344412bc2fc35a4249b8fecb344a79143" gracePeriod=30 Oct 03 18:34:23 crc kubenswrapper[4835]: I1003 18:34:23.062709 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" containerName="proxy-httpd" containerID="cri-o://d08afdd27820a50f188d68b3bd6c23fc1f3190ddced4856c4515b691dbf35b44" gracePeriod=30 Oct 03 18:34:23 crc kubenswrapper[4835]: I1003 18:34:23.799777 4835 generic.go:334] "Generic (PLEG): container finished" podID="c36b0768-550f-4685-a011-cedfeaa5e318" containerID="d08afdd27820a50f188d68b3bd6c23fc1f3190ddced4856c4515b691dbf35b44" exitCode=0 Oct 03 18:34:23 crc kubenswrapper[4835]: I1003 18:34:23.800214 4835 
generic.go:334] "Generic (PLEG): container finished" podID="c36b0768-550f-4685-a011-cedfeaa5e318" containerID="f1240b927d8e6c2f88c552f2985b74c410dd6c2176e311d2e798dcb2602ef4d3" exitCode=2 Oct 03 18:34:23 crc kubenswrapper[4835]: I1003 18:34:23.800231 4835 generic.go:334] "Generic (PLEG): container finished" podID="c36b0768-550f-4685-a011-cedfeaa5e318" containerID="80ab64113f8364c8841479a5505907262a22e6387e15c7df9907d8eed659ccbe" exitCode=0 Oct 03 18:34:23 crc kubenswrapper[4835]: I1003 18:34:23.800457 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c36b0768-550f-4685-a011-cedfeaa5e318","Type":"ContainerDied","Data":"d08afdd27820a50f188d68b3bd6c23fc1f3190ddced4856c4515b691dbf35b44"} Oct 03 18:34:23 crc kubenswrapper[4835]: I1003 18:34:23.800502 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c36b0768-550f-4685-a011-cedfeaa5e318","Type":"ContainerDied","Data":"f1240b927d8e6c2f88c552f2985b74c410dd6c2176e311d2e798dcb2602ef4d3"} Oct 03 18:34:23 crc kubenswrapper[4835]: I1003 18:34:23.800515 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c36b0768-550f-4685-a011-cedfeaa5e318","Type":"ContainerDied","Data":"80ab64113f8364c8841479a5505907262a22e6387e15c7df9907d8eed659ccbe"} Oct 03 18:34:23 crc kubenswrapper[4835]: I1003 18:34:23.802117 4835 generic.go:334] "Generic (PLEG): container finished" podID="04606cff-96b7-4cec-a55c-806267b559dc" containerID="f7ebab970bb4e77e30fcf09a64074147373976e59473678be8aaee33eb783b59" exitCode=0 Oct 03 18:34:23 crc kubenswrapper[4835]: I1003 18:34:23.802163 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"04606cff-96b7-4cec-a55c-806267b559dc","Type":"ContainerDied","Data":"f7ebab970bb4e77e30fcf09a64074147373976e59473678be8aaee33eb783b59"} Oct 03 18:34:23 crc kubenswrapper[4835]: I1003 18:34:23.804348 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6243f0de-fd35-43f6-8eaa-63836b03e125","Type":"ContainerStarted","Data":"a3d4cb4793dffe003161764581358797d2c3691ac8f9f2ddf75d8ce7288a5098"} Oct 03 18:34:23 crc kubenswrapper[4835]: I1003 18:34:23.804370 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6243f0de-fd35-43f6-8eaa-63836b03e125","Type":"ContainerStarted","Data":"47d936528c60f3a887b1cd1c00579a9d1ce465e9734273cd3410dc77aeee326a"} Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.068331 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.229239 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"04606cff-96b7-4cec-a55c-806267b559dc\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.229292 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04606cff-96b7-4cec-a55c-806267b559dc-httpd-run\") pod \"04606cff-96b7-4cec-a55c-806267b559dc\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.229328 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-scripts\") pod \"04606cff-96b7-4cec-a55c-806267b559dc\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.229450 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04606cff-96b7-4cec-a55c-806267b559dc-logs\") pod \"04606cff-96b7-4cec-a55c-806267b559dc\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.229468 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-config-data\") pod \"04606cff-96b7-4cec-a55c-806267b559dc\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.229561 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7db9\" (UniqueName: \"kubernetes.io/projected/04606cff-96b7-4cec-a55c-806267b559dc-kube-api-access-k7db9\") pod \"04606cff-96b7-4cec-a55c-806267b559dc\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.229640 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04606cff-96b7-4cec-a55c-806267b559dc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "04606cff-96b7-4cec-a55c-806267b559dc" (UID: "04606cff-96b7-4cec-a55c-806267b559dc"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.229650 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-combined-ca-bundle\") pod \"04606cff-96b7-4cec-a55c-806267b559dc\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.229740 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-internal-tls-certs\") pod \"04606cff-96b7-4cec-a55c-806267b559dc\" (UID: \"04606cff-96b7-4cec-a55c-806267b559dc\") " Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.230086 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04606cff-96b7-4cec-a55c-806267b559dc-logs" (OuterVolumeSpecName: "logs") pod "04606cff-96b7-4cec-a55c-806267b559dc" (UID: "04606cff-96b7-4cec-a55c-806267b559dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.230861 4835 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04606cff-96b7-4cec-a55c-806267b559dc-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.230885 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04606cff-96b7-4cec-a55c-806267b559dc-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.235275 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04606cff-96b7-4cec-a55c-806267b559dc-kube-api-access-k7db9" (OuterVolumeSpecName: "kube-api-access-k7db9") pod "04606cff-96b7-4cec-a55c-806267b559dc" (UID: "04606cff-96b7-4cec-a55c-806267b559dc"). InnerVolumeSpecName "kube-api-access-k7db9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.235676 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "04606cff-96b7-4cec-a55c-806267b559dc" (UID: "04606cff-96b7-4cec-a55c-806267b559dc"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.239009 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-scripts" (OuterVolumeSpecName: "scripts") pod "04606cff-96b7-4cec-a55c-806267b559dc" (UID: "04606cff-96b7-4cec-a55c-806267b559dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.266940 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04606cff-96b7-4cec-a55c-806267b559dc" (UID: "04606cff-96b7-4cec-a55c-806267b559dc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.289889 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "04606cff-96b7-4cec-a55c-806267b559dc" (UID: "04606cff-96b7-4cec-a55c-806267b559dc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.297193 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-config-data" (OuterVolumeSpecName: "config-data") pod "04606cff-96b7-4cec-a55c-806267b559dc" (UID: "04606cff-96b7-4cec-a55c-806267b559dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.332967 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.333258 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7db9\" (UniqueName: \"kubernetes.io/projected/04606cff-96b7-4cec-a55c-806267b559dc-kube-api-access-k7db9\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.333368 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.333439 4835 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.333546 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.333633 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04606cff-96b7-4cec-a55c-806267b559dc-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.352821 4835 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.435036 4835 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.814957 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"04606cff-96b7-4cec-a55c-806267b559dc","Type":"ContainerDied","Data":"d65acc9a3ad64d0a66803b61a59ed5067fe358c4a7ef3a830748203468e4703d"} Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.815255 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.815308 4835 scope.go:117] "RemoveContainer" containerID="f7ebab970bb4e77e30fcf09a64074147373976e59473678be8aaee33eb783b59" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.822220 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6243f0de-fd35-43f6-8eaa-63836b03e125","Type":"ContainerStarted","Data":"db348150175bfdfcdc8e8a7bae1cceda7c035cd2e5e1e6a664d99495c311ebd1"} Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.838694 4835 scope.go:117] "RemoveContainer" containerID="0010c15727a0daf6e38ac8b5d622875a87191a587dfa9535b109c0c7f42377d8" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.841600 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.8415842639999997 podStartE2EDuration="3.841584264s" podCreationTimestamp="2025-10-03 18:34:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:34:24.839929334 +0000 UTC m=+1206.555870216" watchObservedRunningTime="2025-10-03 18:34:24.841584264 +0000 UTC m=+1206.557525126" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.864878 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.875951 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.888435 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04606cff-96b7-4cec-a55c-806267b559dc" path="/var/lib/kubelet/pods/04606cff-96b7-4cec-a55c-806267b559dc/volumes" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.889016 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 18:34:24 crc kubenswrapper[4835]: E1003 18:34:24.889364 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04606cff-96b7-4cec-a55c-806267b559dc" containerName="glance-log" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.889382 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="04606cff-96b7-4cec-a55c-806267b559dc" containerName="glance-log" Oct 03 18:34:24 crc kubenswrapper[4835]: E1003 18:34:24.889411 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04606cff-96b7-4cec-a55c-806267b559dc" containerName="glance-httpd" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.889417 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="04606cff-96b7-4cec-a55c-806267b559dc" containerName="glance-httpd" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.889587 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="04606cff-96b7-4cec-a55c-806267b559dc" containerName="glance-log" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.889609 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="04606cff-96b7-4cec-a55c-806267b559dc" containerName="glance-httpd" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.891061 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.894797 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.894845 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 18:34:24 crc kubenswrapper[4835]: I1003 18:34:24.904356 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.045685 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/399dbdfc-97f0-4d54-9e3e-a18ae490838c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.045785 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/399dbdfc-97f0-4d54-9e3e-a18ae490838c-logs\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.045919 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/399dbdfc-97f0-4d54-9e3e-a18ae490838c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.046263 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/399dbdfc-97f0-4d54-9e3e-a18ae490838c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.046388 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.046439 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/399dbdfc-97f0-4d54-9e3e-a18ae490838c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.046498 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399dbdfc-97f0-4d54-9e3e-a18ae490838c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.046537 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-xlngd\" (UniqueName: \"kubernetes.io/projected/399dbdfc-97f0-4d54-9e3e-a18ae490838c-kube-api-access-xlngd\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.148991 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/399dbdfc-97f0-4d54-9e3e-a18ae490838c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.149063 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399dbdfc-97f0-4d54-9e3e-a18ae490838c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.149106 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlngd\" (UniqueName: \"kubernetes.io/projected/399dbdfc-97f0-4d54-9e3e-a18ae490838c-kube-api-access-xlngd\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.149171 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/399dbdfc-97f0-4d54-9e3e-a18ae490838c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.149211 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/399dbdfc-97f0-4d54-9e3e-a18ae490838c-logs\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.149243 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/399dbdfc-97f0-4d54-9e3e-a18ae490838c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.149308 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/399dbdfc-97f0-4d54-9e3e-a18ae490838c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.149369 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.149854 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.150658 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/399dbdfc-97f0-4d54-9e3e-a18ae490838c-logs\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.150710 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/399dbdfc-97f0-4d54-9e3e-a18ae490838c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.153693 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/399dbdfc-97f0-4d54-9e3e-a18ae490838c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.154056 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/399dbdfc-97f0-4d54-9e3e-a18ae490838c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.165954 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399dbdfc-97f0-4d54-9e3e-a18ae490838c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.167091 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/399dbdfc-97f0-4d54-9e3e-a18ae490838c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.174818 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlngd\" (UniqueName: \"kubernetes.io/projected/399dbdfc-97f0-4d54-9e3e-a18ae490838c-kube-api-access-xlngd\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.193135 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"399dbdfc-97f0-4d54-9e3e-a18ae490838c\") " pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.209567 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.676991 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 18:34:25 crc kubenswrapper[4835]: W1003 18:34:25.678615 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod399dbdfc_97f0_4d54_9e3e_a18ae490838c.slice/crio-ce1bfaa91e4c3bf45c78390625067b31bd41ae312d49b0e553acd7df5b9f6f63 WatchSource:0}: Error finding container ce1bfaa91e4c3bf45c78390625067b31bd41ae312d49b0e553acd7df5b9f6f63: Status 404 returned error can't find the container with id ce1bfaa91e4c3bf45c78390625067b31bd41ae312d49b0e553acd7df5b9f6f63 Oct 03 18:34:25 crc kubenswrapper[4835]: I1003 18:34:25.837034 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"399dbdfc-97f0-4d54-9e3e-a18ae490838c","Type":"ContainerStarted","Data":"ce1bfaa91e4c3bf45c78390625067b31bd41ae312d49b0e553acd7df5b9f6f63"} Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.400572 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.482918 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c36b0768-550f-4685-a011-cedfeaa5e318-log-httpd\") pod \"c36b0768-550f-4685-a011-cedfeaa5e318\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.482982 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c36b0768-550f-4685-a011-cedfeaa5e318-run-httpd\") pod \"c36b0768-550f-4685-a011-cedfeaa5e318\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.483057 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r49m\" (UniqueName: \"kubernetes.io/projected/c36b0768-550f-4685-a011-cedfeaa5e318-kube-api-access-6r49m\") pod \"c36b0768-550f-4685-a011-cedfeaa5e318\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.483132 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-scripts\") pod \"c36b0768-550f-4685-a011-cedfeaa5e318\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.483172 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-sg-core-conf-yaml\") pod \"c36b0768-550f-4685-a011-cedfeaa5e318\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.483270 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-combined-ca-bundle\") pod \"c36b0768-550f-4685-a011-cedfeaa5e318\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.483292 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-config-data\") pod \"c36b0768-550f-4685-a011-cedfeaa5e318\" (UID: \"c36b0768-550f-4685-a011-cedfeaa5e318\") " Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.483512 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c36b0768-550f-4685-a011-cedfeaa5e318-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c36b0768-550f-4685-a011-cedfeaa5e318" (UID: "c36b0768-550f-4685-a011-cedfeaa5e318"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.483530 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c36b0768-550f-4685-a011-cedfeaa5e318-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c36b0768-550f-4685-a011-cedfeaa5e318" (UID: "c36b0768-550f-4685-a011-cedfeaa5e318"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.484153 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c36b0768-550f-4685-a011-cedfeaa5e318-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.484173 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c36b0768-550f-4685-a011-cedfeaa5e318-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.490040 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-scripts" (OuterVolumeSpecName: "scripts") pod "c36b0768-550f-4685-a011-cedfeaa5e318" (UID: "c36b0768-550f-4685-a011-cedfeaa5e318"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.490288 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c36b0768-550f-4685-a011-cedfeaa5e318-kube-api-access-6r49m" (OuterVolumeSpecName: "kube-api-access-6r49m") pod "c36b0768-550f-4685-a011-cedfeaa5e318" (UID: "c36b0768-550f-4685-a011-cedfeaa5e318"). InnerVolumeSpecName "kube-api-access-6r49m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.514689 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c36b0768-550f-4685-a011-cedfeaa5e318" (UID: "c36b0768-550f-4685-a011-cedfeaa5e318"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.554797 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c36b0768-550f-4685-a011-cedfeaa5e318" (UID: "c36b0768-550f-4685-a011-cedfeaa5e318"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.573239 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-config-data" (OuterVolumeSpecName: "config-data") pod "c36b0768-550f-4685-a011-cedfeaa5e318" (UID: "c36b0768-550f-4685-a011-cedfeaa5e318"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.585431 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r49m\" (UniqueName: \"kubernetes.io/projected/c36b0768-550f-4685-a011-cedfeaa5e318-kube-api-access-6r49m\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.585463 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.585475 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.585483 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.585491 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c36b0768-550f-4685-a011-cedfeaa5e318-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.859372 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"399dbdfc-97f0-4d54-9e3e-a18ae490838c","Type":"ContainerStarted","Data":"05fd7dd5797c3569d9fb2c047a106202fbf6a7b8f57b53baa31d284b8ea762a0"} Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.859680 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"399dbdfc-97f0-4d54-9e3e-a18ae490838c","Type":"ContainerStarted","Data":"a8e576d6cbd69c56c3143380c53d0d964df16351b0e831f9ab14f2331275c53c"} Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.864853 4835 generic.go:334] "Generic (PLEG): container finished" podID="c36b0768-550f-4685-a011-cedfeaa5e318" containerID="22c9abbf34f1e5b4fa9a6759e5cab36344412bc2fc35a4249b8fecb344a79143" exitCode=0 Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.864921 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c36b0768-550f-4685-a011-cedfeaa5e318","Type":"ContainerDied","Data":"22c9abbf34f1e5b4fa9a6759e5cab36344412bc2fc35a4249b8fecb344a79143"} Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.864947 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c36b0768-550f-4685-a011-cedfeaa5e318","Type":"ContainerDied","Data":"4051baf5c2b8fae9d75cfc65db760a34da9d2cda77416321348cad0957bf7173"} Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.864963 4835 scope.go:117] "RemoveContainer" containerID="d08afdd27820a50f188d68b3bd6c23fc1f3190ddced4856c4515b691dbf35b44" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.865103 4835 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.885838 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.885799499 podStartE2EDuration="2.885799499s" podCreationTimestamp="2025-10-03 18:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:34:26.876567934 +0000 UTC m=+1208.592508806" watchObservedRunningTime="2025-10-03 18:34:26.885799499 +0000 UTC m=+1208.601740371" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.917204 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.929886 4835 scope.go:117] "RemoveContainer" containerID="f1240b927d8e6c2f88c552f2985b74c410dd6c2176e311d2e798dcb2602ef4d3" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.935746 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.946329 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:26 crc kubenswrapper[4835]: E1003 18:34:26.946768 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" containerName="proxy-httpd" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.946786 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" containerName="proxy-httpd" Oct 03 18:34:26 crc kubenswrapper[4835]: E1003 18:34:26.946796 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" containerName="ceilometer-central-agent" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.946803 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" containerName="ceilometer-central-agent" Oct 03 18:34:26 crc kubenswrapper[4835]: E1003 18:34:26.946820 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" containerName="sg-core" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.946827 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" containerName="sg-core" Oct 03 18:34:26 crc kubenswrapper[4835]: E1003 18:34:26.946849 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" containerName="ceilometer-notification-agent" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.946854 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" containerName="ceilometer-notification-agent" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.947086 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" containerName="ceilometer-notification-agent" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.947115 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" containerName="proxy-httpd" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.947131 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" containerName="sg-core" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 
18:34:26.947146 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" containerName="ceilometer-central-agent" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.948868 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.951192 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.951938 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.957275 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.967819 4835 scope.go:117] "RemoveContainer" containerID="22c9abbf34f1e5b4fa9a6759e5cab36344412bc2fc35a4249b8fecb344a79143" Oct 03 18:34:26 crc kubenswrapper[4835]: I1003 18:34:26.997419 4835 scope.go:117] "RemoveContainer" containerID="80ab64113f8364c8841479a5505907262a22e6387e15c7df9907d8eed659ccbe" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.015929 4835 scope.go:117] "RemoveContainer" containerID="d08afdd27820a50f188d68b3bd6c23fc1f3190ddced4856c4515b691dbf35b44" Oct 03 18:34:27 crc kubenswrapper[4835]: E1003 18:34:27.017311 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d08afdd27820a50f188d68b3bd6c23fc1f3190ddced4856c4515b691dbf35b44\": container with ID starting with d08afdd27820a50f188d68b3bd6c23fc1f3190ddced4856c4515b691dbf35b44 not found: ID does not exist" containerID="d08afdd27820a50f188d68b3bd6c23fc1f3190ddced4856c4515b691dbf35b44" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.017356 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d08afdd27820a50f188d68b3bd6c23fc1f3190ddced4856c4515b691dbf35b44"} err="failed to get container status \"d08afdd27820a50f188d68b3bd6c23fc1f3190ddced4856c4515b691dbf35b44\": rpc error: code = NotFound desc = could not find container \"d08afdd27820a50f188d68b3bd6c23fc1f3190ddced4856c4515b691dbf35b44\": container with ID starting with d08afdd27820a50f188d68b3bd6c23fc1f3190ddced4856c4515b691dbf35b44 not found: ID does not exist" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.017383 4835 scope.go:117] "RemoveContainer" containerID="f1240b927d8e6c2f88c552f2985b74c410dd6c2176e311d2e798dcb2602ef4d3" Oct 03 18:34:27 crc kubenswrapper[4835]: E1003 18:34:27.017880 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1240b927d8e6c2f88c552f2985b74c410dd6c2176e311d2e798dcb2602ef4d3\": container with ID starting with f1240b927d8e6c2f88c552f2985b74c410dd6c2176e311d2e798dcb2602ef4d3 not found: ID does not exist" containerID="f1240b927d8e6c2f88c552f2985b74c410dd6c2176e311d2e798dcb2602ef4d3" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.017906 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1240b927d8e6c2f88c552f2985b74c410dd6c2176e311d2e798dcb2602ef4d3"} err="failed to get container status \"f1240b927d8e6c2f88c552f2985b74c410dd6c2176e311d2e798dcb2602ef4d3\": rpc error: code = NotFound desc = could not find container \"f1240b927d8e6c2f88c552f2985b74c410dd6c2176e311d2e798dcb2602ef4d3\": container with ID starting with 
f1240b927d8e6c2f88c552f2985b74c410dd6c2176e311d2e798dcb2602ef4d3 not found: ID does not exist" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.017919 4835 scope.go:117] "RemoveContainer" containerID="22c9abbf34f1e5b4fa9a6759e5cab36344412bc2fc35a4249b8fecb344a79143" Oct 03 18:34:27 crc kubenswrapper[4835]: E1003 18:34:27.018240 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22c9abbf34f1e5b4fa9a6759e5cab36344412bc2fc35a4249b8fecb344a79143\": container with ID starting with 22c9abbf34f1e5b4fa9a6759e5cab36344412bc2fc35a4249b8fecb344a79143 not found: ID does not exist" containerID="22c9abbf34f1e5b4fa9a6759e5cab36344412bc2fc35a4249b8fecb344a79143" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.018264 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c9abbf34f1e5b4fa9a6759e5cab36344412bc2fc35a4249b8fecb344a79143"} err="failed to get container status \"22c9abbf34f1e5b4fa9a6759e5cab36344412bc2fc35a4249b8fecb344a79143\": rpc error: code = NotFound desc = could not find container \"22c9abbf34f1e5b4fa9a6759e5cab36344412bc2fc35a4249b8fecb344a79143\": container with ID starting with 22c9abbf34f1e5b4fa9a6759e5cab36344412bc2fc35a4249b8fecb344a79143 not found: ID does not exist" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.018278 4835 scope.go:117] "RemoveContainer" containerID="80ab64113f8364c8841479a5505907262a22e6387e15c7df9907d8eed659ccbe" Oct 03 18:34:27 crc kubenswrapper[4835]: E1003 18:34:27.018474 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80ab64113f8364c8841479a5505907262a22e6387e15c7df9907d8eed659ccbe\": container with ID starting with 80ab64113f8364c8841479a5505907262a22e6387e15c7df9907d8eed659ccbe not found: ID does not exist" containerID="80ab64113f8364c8841479a5505907262a22e6387e15c7df9907d8eed659ccbe" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.018492 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80ab64113f8364c8841479a5505907262a22e6387e15c7df9907d8eed659ccbe"} err="failed to get container status \"80ab64113f8364c8841479a5505907262a22e6387e15c7df9907d8eed659ccbe\": rpc error: code = NotFound desc = could not find container \"80ab64113f8364c8841479a5505907262a22e6387e15c7df9907d8eed659ccbe\": container with ID starting with 80ab64113f8364c8841479a5505907262a22e6387e15c7df9907d8eed659ccbe not found: ID does not exist" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.101270 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.101312 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e75e68d-44d9-475b-8a16-d8b9bf770678-log-httpd\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.101452 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-scripts\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.101496 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-config-data\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.101522 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp5tp\" (UniqueName: \"kubernetes.io/projected/3e75e68d-44d9-475b-8a16-d8b9bf770678-kube-api-access-sp5tp\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.101566 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e75e68d-44d9-475b-8a16-d8b9bf770678-run-httpd\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.101613 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.203997 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-scripts\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.204043 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-config-data\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.204061 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp5tp\" (UniqueName: \"kubernetes.io/projected/3e75e68d-44d9-475b-8a16-d8b9bf770678-kube-api-access-sp5tp\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.204109 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e75e68d-44d9-475b-8a16-d8b9bf770678-run-httpd\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.204149 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.204188 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.204618 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e75e68d-44d9-475b-8a16-d8b9bf770678-run-httpd\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.204698 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e75e68d-44d9-475b-8a16-d8b9bf770678-log-httpd\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.205243 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e75e68d-44d9-475b-8a16-d8b9bf770678-log-httpd\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.209686 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.210058 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.214609 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-config-data\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.219367 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-scripts\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.225403 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp5tp\" (UniqueName: \"kubernetes.io/projected/3e75e68d-44d9-475b-8a16-d8b9bf770678-kube-api-access-sp5tp\") pod \"ceilometer-0\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.268036 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.718426 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:34:27 crc kubenswrapper[4835]: W1003 18:34:27.725713 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e75e68d_44d9_475b_8a16_d8b9bf770678.slice/crio-a756470c0a90b277bc60e8f3c2818a9f8191567d64244261159c93ee48f949bc WatchSource:0}: Error finding container a756470c0a90b277bc60e8f3c2818a9f8191567d64244261159c93ee48f949bc: Status 404 returned error can't find the container with id a756470c0a90b277bc60e8f3c2818a9f8191567d64244261159c93ee48f949bc Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.728598 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.884877 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e75e68d-44d9-475b-8a16-d8b9bf770678","Type":"ContainerStarted","Data":"a756470c0a90b277bc60e8f3c2818a9f8191567d64244261159c93ee48f949bc"} Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.886988 4835 generic.go:334] "Generic (PLEG): container finished" podID="9703db0d-8373-4427-98a0-c36d069f7c71" containerID="3d4b88308c914465b10d8db996b8bb3ca48477dff927d18fdf94c1236e2a76dd" exitCode=0 Oct 03 18:34:27 crc kubenswrapper[4835]: I1003 18:34:27.887054 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c9pn5" event={"ID":"9703db0d-8373-4427-98a0-c36d069f7c71","Type":"ContainerDied","Data":"3d4b88308c914465b10d8db996b8bb3ca48477dff927d18fdf94c1236e2a76dd"} Oct 03 18:34:28 crc kubenswrapper[4835]: I1003 18:34:28.889602 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c36b0768-550f-4685-a011-cedfeaa5e318" path="/var/lib/kubelet/pods/c36b0768-550f-4685-a011-cedfeaa5e318/volumes" Oct 03 18:34:28 crc kubenswrapper[4835]: I1003 18:34:28.899241 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e75e68d-44d9-475b-8a16-d8b9bf770678","Type":"ContainerStarted","Data":"297027a217607a798f2318c367cfdbec57bc52f5554b2e9687796c1ae994e877"} Oct 03 18:34:28 crc kubenswrapper[4835]: I1003 18:34:28.899285 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e75e68d-44d9-475b-8a16-d8b9bf770678","Type":"ContainerStarted","Data":"a340fd7619fbe376031c0d7d73d4c9b53a33d1e323d355f67e7a454a8c325b69"} Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.394842 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c9pn5" Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.550645 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mwrb\" (UniqueName: \"kubernetes.io/projected/9703db0d-8373-4427-98a0-c36d069f7c71-kube-api-access-4mwrb\") pod \"9703db0d-8373-4427-98a0-c36d069f7c71\" (UID: \"9703db0d-8373-4427-98a0-c36d069f7c71\") " Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.550742 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-combined-ca-bundle\") pod \"9703db0d-8373-4427-98a0-c36d069f7c71\" (UID: \"9703db0d-8373-4427-98a0-c36d069f7c71\") " Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.550765 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-scripts\") pod \"9703db0d-8373-4427-98a0-c36d069f7c71\" (UID: \"9703db0d-8373-4427-98a0-c36d069f7c71\") " Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.550853 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-config-data\") pod \"9703db0d-8373-4427-98a0-c36d069f7c71\" (UID: \"9703db0d-8373-4427-98a0-c36d069f7c71\") " Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.574717 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-scripts" (OuterVolumeSpecName: "scripts") pod "9703db0d-8373-4427-98a0-c36d069f7c71" (UID: "9703db0d-8373-4427-98a0-c36d069f7c71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.589906 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9703db0d-8373-4427-98a0-c36d069f7c71-kube-api-access-4mwrb" (OuterVolumeSpecName: "kube-api-access-4mwrb") pod "9703db0d-8373-4427-98a0-c36d069f7c71" (UID: "9703db0d-8373-4427-98a0-c36d069f7c71"). InnerVolumeSpecName "kube-api-access-4mwrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.593873 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-config-data" (OuterVolumeSpecName: "config-data") pod "9703db0d-8373-4427-98a0-c36d069f7c71" (UID: "9703db0d-8373-4427-98a0-c36d069f7c71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.608531 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9703db0d-8373-4427-98a0-c36d069f7c71" (UID: "9703db0d-8373-4427-98a0-c36d069f7c71"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.652780 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mwrb\" (UniqueName: \"kubernetes.io/projected/9703db0d-8373-4427-98a0-c36d069f7c71-kube-api-access-4mwrb\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.652850 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.652861 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.652870 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9703db0d-8373-4427-98a0-c36d069f7c71-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.911796 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e75e68d-44d9-475b-8a16-d8b9bf770678","Type":"ContainerStarted","Data":"356926a0203256759bebd6f40cfe285c0084a92e52991134700c2c40a7507b90"} Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.913718 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-c9pn5" event={"ID":"9703db0d-8373-4427-98a0-c36d069f7c71","Type":"ContainerDied","Data":"757119ed03f048d164cf308c69c73f080270273e915926828c826256e5593861"} Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.913752 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-c9pn5" Oct 03 18:34:29 crc kubenswrapper[4835]: I1003 18:34:29.913767 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="757119ed03f048d164cf308c69c73f080270273e915926828c826256e5593861" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.019117 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 18:34:30 crc kubenswrapper[4835]: E1003 18:34:30.019607 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9703db0d-8373-4427-98a0-c36d069f7c71" containerName="nova-cell0-conductor-db-sync" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.019630 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9703db0d-8373-4427-98a0-c36d069f7c71" containerName="nova-cell0-conductor-db-sync" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.019914 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9703db0d-8373-4427-98a0-c36d069f7c71" containerName="nova-cell0-conductor-db-sync" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.020853 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.024489 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qmmml" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.024600 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.027564 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.161988 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28ff1a76-f690-4565-962f-6768463be408-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"28ff1a76-f690-4565-962f-6768463be408\") " pod="openstack/nova-cell0-conductor-0" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.162188 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28ff1a76-f690-4565-962f-6768463be408-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"28ff1a76-f690-4565-962f-6768463be408\") " pod="openstack/nova-cell0-conductor-0" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.162227 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht9vg\" (UniqueName: \"kubernetes.io/projected/28ff1a76-f690-4565-962f-6768463be408-kube-api-access-ht9vg\") pod \"nova-cell0-conductor-0\" (UID: \"28ff1a76-f690-4565-962f-6768463be408\") " pod="openstack/nova-cell0-conductor-0" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.264001 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28ff1a76-f690-4565-962f-6768463be408-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"28ff1a76-f690-4565-962f-6768463be408\") " pod="openstack/nova-cell0-conductor-0" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.264040 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht9vg\" (UniqueName: \"kubernetes.io/projected/28ff1a76-f690-4565-962f-6768463be408-kube-api-access-ht9vg\") pod \"nova-cell0-conductor-0\" (UID: \"28ff1a76-f690-4565-962f-6768463be408\") " pod="openstack/nova-cell0-conductor-0" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.264187 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28ff1a76-f690-4565-962f-6768463be408-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"28ff1a76-f690-4565-962f-6768463be408\") " pod="openstack/nova-cell0-conductor-0" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.269993 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28ff1a76-f690-4565-962f-6768463be408-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"28ff1a76-f690-4565-962f-6768463be408\") " pod="openstack/nova-cell0-conductor-0" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.272554 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28ff1a76-f690-4565-962f-6768463be408-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"28ff1a76-f690-4565-962f-6768463be408\") " pod="openstack/nova-cell0-conductor-0" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.282993 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht9vg\" (UniqueName: \"kubernetes.io/projected/28ff1a76-f690-4565-962f-6768463be408-kube-api-access-ht9vg\") pod \"nova-cell0-conductor-0\" (UID: \"28ff1a76-f690-4565-962f-6768463be408\") " pod="openstack/nova-cell0-conductor-0" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.347233 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.786619 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 18:34:30 crc kubenswrapper[4835]: W1003 18:34:30.790380 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28ff1a76_f690_4565_962f_6768463be408.slice/crio-6ac2c7fd0e314b7b5d53af875717aacbb2dd971a18561a74555e58fd9746f618 WatchSource:0}: Error finding container 6ac2c7fd0e314b7b5d53af875717aacbb2dd971a18561a74555e58fd9746f618: Status 404 returned error can't find the container with id 6ac2c7fd0e314b7b5d53af875717aacbb2dd971a18561a74555e58fd9746f618 Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.933824 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e75e68d-44d9-475b-8a16-d8b9bf770678","Type":"ContainerStarted","Data":"c781100bfeed11c33023ae039f6c8460894c23edcdd15494ffdf06817ab87dcb"} Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.934294 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.935373 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"28ff1a76-f690-4565-962f-6768463be408","Type":"ContainerStarted","Data":"6ac2c7fd0e314b7b5d53af875717aacbb2dd971a18561a74555e58fd9746f618"} Oct 03 18:34:30 crc kubenswrapper[4835]: I1003 18:34:30.957429 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.356432523 podStartE2EDuration="4.957407372s" podCreationTimestamp="2025-10-03 18:34:26 +0000 UTC" firstStartedPulling="2025-10-03 18:34:27.728327404 +0000 UTC m=+1209.444268276" lastFinishedPulling="2025-10-03 18:34:30.329302253 +0000 UTC m=+1212.045243125" observedRunningTime="2025-10-03 18:34:30.954926711 +0000 UTC m=+1212.670867593" watchObservedRunningTime="2025-10-03 18:34:30.957407372 +0000 UTC m=+1212.673348244" Oct 03 18:34:31 crc kubenswrapper[4835]: I1003 18:34:31.948629 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"28ff1a76-f690-4565-962f-6768463be408","Type":"ContainerStarted","Data":"de1460956e37f313c7b719fa04540ee9b6c3c31bbfb4f8b24ec0a19b51e1fcef"} Oct 03 18:34:31 crc kubenswrapper[4835]: I1003 18:34:31.949042 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 03 18:34:31 crc kubenswrapper[4835]: I1003 18:34:31.971251 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.971231573 podStartE2EDuration="2.971231573s" podCreationTimestamp="2025-10-03 18:34:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:34:31.968356463 +0000 UTC m=+1213.684297345" watchObservedRunningTime="2025-10-03 18:34:31.971231573 +0000 UTC m=+1213.687172465" Oct 03 18:34:32 crc kubenswrapper[4835]: I1003 18:34:32.189942 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 18:34:32 crc kubenswrapper[4835]: I1003 18:34:32.189989 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 18:34:32 crc kubenswrapper[4835]: I1003 18:34:32.225266 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 18:34:32 crc kubenswrapper[4835]: I1003 18:34:32.241001 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 18:34:32 crc kubenswrapper[4835]: I1003 18:34:32.954760 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 18:34:32 crc kubenswrapper[4835]: I1003 18:34:32.954794 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 18:34:34 crc kubenswrapper[4835]: I1003 18:34:34.817032 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 18:34:34 crc kubenswrapper[4835]: I1003 18:34:34.877540 4835 scope.go:117] "RemoveContainer" containerID="97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c" Oct 03 18:34:34 crc kubenswrapper[4835]: I1003 18:34:34.890737 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 18:34:35 crc kubenswrapper[4835]: I1003 18:34:35.210542 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 18:34:35 crc kubenswrapper[4835]: I1003 18:34:35.211002 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 18:34:35 crc kubenswrapper[4835]: I1003 18:34:35.240569 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 18:34:35 crc kubenswrapper[4835]: I1003 18:34:35.255235 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 18:34:36 crc kubenswrapper[4835]: I1003 18:34:36.000615 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8110d0e5-9e19-4306-b8aa-babe937e8d2a","Type":"ContainerStarted","Data":"ed51c69d7e999197964c6640b29e3b033c37805585c66315bbaab6947526f079"} Oct 03 18:34:36 crc kubenswrapper[4835]: I1003 18:34:36.001859 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 18:34:36 crc kubenswrapper[4835]: I1003 18:34:36.002152 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 18:34:37 crc kubenswrapper[4835]: I1003 18:34:37.722208 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 18:34:37 crc kubenswrapper[4835]: I1003 18:34:37.804464 4835 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 18:34:38 crc kubenswrapper[4835]: I1003 18:34:38.165868 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 03 18:34:38 crc kubenswrapper[4835]: I1003 18:34:38.165920 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 18:34:38 crc kubenswrapper[4835]: I1003 18:34:38.191250 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 03 18:34:39 crc kubenswrapper[4835]: I1003 18:34:39.073317 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 03 18:34:39 crc kubenswrapper[4835]: I1003 18:34:39.107974 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 18:34:40 crc kubenswrapper[4835]: I1003 18:34:40.374708 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 03 18:34:40 crc kubenswrapper[4835]: I1003 18:34:40.801590 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-7jx5c"] Oct 03 18:34:40 crc kubenswrapper[4835]: I1003 18:34:40.802972 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7jx5c" Oct 03 18:34:40 crc kubenswrapper[4835]: I1003 18:34:40.805003 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 03 18:34:40 crc kubenswrapper[4835]: I1003 18:34:40.805564 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 03 18:34:40 crc kubenswrapper[4835]: I1003 18:34:40.812955 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7jx5c"] Oct 03 18:34:40 crc kubenswrapper[4835]: I1003 18:34:40.969752 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-config-data\") pod \"nova-cell0-cell-mapping-7jx5c\" (UID: \"ce857dee-2b33-413b-8040-6012915be992\") " pod="openstack/nova-cell0-cell-mapping-7jx5c" Oct 03 18:34:40 crc kubenswrapper[4835]: I1003 18:34:40.969831 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7jx5c\" (UID: \"ce857dee-2b33-413b-8040-6012915be992\") " pod="openstack/nova-cell0-cell-mapping-7jx5c" Oct 03 18:34:40 crc kubenswrapper[4835]: I1003 18:34:40.969859 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-scripts\") pod \"nova-cell0-cell-mapping-7jx5c\" (UID: \"ce857dee-2b33-413b-8040-6012915be992\") " pod="openstack/nova-cell0-cell-mapping-7jx5c" Oct 03 18:34:40 crc kubenswrapper[4835]: I1003 18:34:40.969884 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp8sl\" (UniqueName: \"kubernetes.io/projected/ce857dee-2b33-413b-8040-6012915be992-kube-api-access-sp8sl\") pod \"nova-cell0-cell-mapping-7jx5c\" (UID: 
\"ce857dee-2b33-413b-8040-6012915be992\") " pod="openstack/nova-cell0-cell-mapping-7jx5c" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.006037 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.007392 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.015952 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.030171 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.032012 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.047793 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.071482 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-config-data\") pod \"nova-cell0-cell-mapping-7jx5c\" (UID: \"ce857dee-2b33-413b-8040-6012915be992\") " pod="openstack/nova-cell0-cell-mapping-7jx5c" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.071530 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7jx5c\" (UID: \"ce857dee-2b33-413b-8040-6012915be992\") " pod="openstack/nova-cell0-cell-mapping-7jx5c" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.071547 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-scripts\") pod \"nova-cell0-cell-mapping-7jx5c\" (UID: \"ce857dee-2b33-413b-8040-6012915be992\") " pod="openstack/nova-cell0-cell-mapping-7jx5c" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.071563 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp8sl\" (UniqueName: \"kubernetes.io/projected/ce857dee-2b33-413b-8040-6012915be992-kube-api-access-sp8sl\") pod \"nova-cell0-cell-mapping-7jx5c\" (UID: \"ce857dee-2b33-413b-8040-6012915be992\") " pod="openstack/nova-cell0-cell-mapping-7jx5c" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.077814 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" containerID="cri-o://ed51c69d7e999197964c6640b29e3b033c37805585c66315bbaab6947526f079" gracePeriod=30 Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.078127 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.088041 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-config-data\") pod \"nova-cell0-cell-mapping-7jx5c\" (UID: \"ce857dee-2b33-413b-8040-6012915be992\") " pod="openstack/nova-cell0-cell-mapping-7jx5c" Oct 03 18:34:41 crc 
kubenswrapper[4835]: I1003 18:34:41.091698 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-scripts\") pod \"nova-cell0-cell-mapping-7jx5c\" (UID: \"ce857dee-2b33-413b-8040-6012915be992\") " pod="openstack/nova-cell0-cell-mapping-7jx5c" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.096649 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7jx5c\" (UID: \"ce857dee-2b33-413b-8040-6012915be992\") " pod="openstack/nova-cell0-cell-mapping-7jx5c" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.118141 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp8sl\" (UniqueName: \"kubernetes.io/projected/ce857dee-2b33-413b-8040-6012915be992-kube-api-access-sp8sl\") pod \"nova-cell0-cell-mapping-7jx5c\" (UID: \"ce857dee-2b33-413b-8040-6012915be992\") " pod="openstack/nova-cell0-cell-mapping-7jx5c" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.121203 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7jx5c" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.127994 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.161002 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.162605 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.167004 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.179210 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\") " pod="openstack/nova-api-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.179271 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6tfl\" (UniqueName: \"kubernetes.io/projected/e4eb715e-8684-471a-9383-f3be1ce5be53-kube-api-access-w6tfl\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4eb715e-8684-471a-9383-f3be1ce5be53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.179300 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-logs\") pod \"nova-api-0\" (UID: \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\") " pod="openstack/nova-api-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.179319 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs6kb\" (UniqueName: \"kubernetes.io/projected/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-kube-api-access-rs6kb\") pod \"nova-api-0\" (UID: \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\") " pod="openstack/nova-api-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.179338 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4eb715e-8684-471a-9383-f3be1ce5be53-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4eb715e-8684-471a-9383-f3be1ce5be53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.179381 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-config-data\") pod \"nova-api-0\" (UID: \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\") " pod="openstack/nova-api-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.179434 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4eb715e-8684-471a-9383-f3be1ce5be53-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4eb715e-8684-471a-9383-f3be1ce5be53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.188485 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.213775 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.219201 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.225640 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.250403 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.280922 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-config-data\") pod \"nova-api-0\" (UID: \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\") " pod="openstack/nova-api-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.280981 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4eb715e-8684-471a-9383-f3be1ce5be53-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4eb715e-8684-471a-9383-f3be1ce5be53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.281014 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a4652d-e363-4818-9561-589c8be56373-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46a4652d-e363-4818-9561-589c8be56373\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.281045 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6fx2\" (UniqueName: \"kubernetes.io/projected/46a4652d-e363-4818-9561-589c8be56373-kube-api-access-k6fx2\") pod \"nova-scheduler-0\" (UID: \"46a4652d-e363-4818-9561-589c8be56373\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.281121 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/46a4652d-e363-4818-9561-589c8be56373-config-data\") pod \"nova-scheduler-0\" (UID: \"46a4652d-e363-4818-9561-589c8be56373\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.281142 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\") " pod="openstack/nova-api-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.281172 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6tfl\" (UniqueName: \"kubernetes.io/projected/e4eb715e-8684-471a-9383-f3be1ce5be53-kube-api-access-w6tfl\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4eb715e-8684-471a-9383-f3be1ce5be53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.281196 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-logs\") pod \"nova-api-0\" (UID: \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\") " pod="openstack/nova-api-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.281213 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs6kb\" (UniqueName: \"kubernetes.io/projected/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-kube-api-access-rs6kb\") pod \"nova-api-0\" (UID: \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\") " pod="openstack/nova-api-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.281232 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4eb715e-8684-471a-9383-f3be1ce5be53-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4eb715e-8684-471a-9383-f3be1ce5be53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.284497 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-logs\") pod \"nova-api-0\" (UID: \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\") " pod="openstack/nova-api-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.285462 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4eb715e-8684-471a-9383-f3be1ce5be53-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4eb715e-8684-471a-9383-f3be1ce5be53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.286991 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-config-data\") pod \"nova-api-0\" (UID: \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\") " pod="openstack/nova-api-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.297845 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4eb715e-8684-471a-9383-f3be1ce5be53-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4eb715e-8684-471a-9383-f3be1ce5be53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.302717 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\") " pod="openstack/nova-api-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.313713 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6tfl\" (UniqueName: \"kubernetes.io/projected/e4eb715e-8684-471a-9383-f3be1ce5be53-kube-api-access-w6tfl\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4eb715e-8684-471a-9383-f3be1ce5be53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.314416 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f55868c59-5xsq8"] Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.316133 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.317567 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs6kb\" (UniqueName: \"kubernetes.io/projected/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-kube-api-access-rs6kb\") pod \"nova-api-0\" (UID: \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\") " pod="openstack/nova-api-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.327134 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f55868c59-5xsq8"] Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.333182 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.387257 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a4652d-e363-4818-9561-589c8be56373-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46a4652d-e363-4818-9561-589c8be56373\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.387428 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6fx2\" (UniqueName: \"kubernetes.io/projected/46a4652d-e363-4818-9561-589c8be56373-kube-api-access-k6fx2\") pod \"nova-scheduler-0\" (UID: \"46a4652d-e363-4818-9561-589c8be56373\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.389791 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a4652d-e363-4818-9561-589c8be56373-config-data\") pod \"nova-scheduler-0\" (UID: \"46a4652d-e363-4818-9561-589c8be56373\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.389833 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hmdx\" (UniqueName: \"kubernetes.io/projected/0cce3c9e-9deb-46e0-bc8d-f83725925996-kube-api-access-8hmdx\") pod \"nova-metadata-0\" (UID: \"0cce3c9e-9deb-46e0-bc8d-f83725925996\") " pod="openstack/nova-metadata-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.389874 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cce3c9e-9deb-46e0-bc8d-f83725925996-logs\") pod \"nova-metadata-0\" (UID: \"0cce3c9e-9deb-46e0-bc8d-f83725925996\") " pod="openstack/nova-metadata-0" Oct 
03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.390130 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cce3c9e-9deb-46e0-bc8d-f83725925996-config-data\") pod \"nova-metadata-0\" (UID: \"0cce3c9e-9deb-46e0-bc8d-f83725925996\") " pod="openstack/nova-metadata-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.390188 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cce3c9e-9deb-46e0-bc8d-f83725925996-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0cce3c9e-9deb-46e0-bc8d-f83725925996\") " pod="openstack/nova-metadata-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.391424 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a4652d-e363-4818-9561-589c8be56373-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46a4652d-e363-4818-9561-589c8be56373\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.395217 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a4652d-e363-4818-9561-589c8be56373-config-data\") pod \"nova-scheduler-0\" (UID: \"46a4652d-e363-4818-9561-589c8be56373\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.398864 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.419366 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6fx2\" (UniqueName: \"kubernetes.io/projected/46a4652d-e363-4818-9561-589c8be56373-kube-api-access-k6fx2\") pod \"nova-scheduler-0\" (UID: \"46a4652d-e363-4818-9561-589c8be56373\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.490774 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cce3c9e-9deb-46e0-bc8d-f83725925996-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0cce3c9e-9deb-46e0-bc8d-f83725925996\") " pod="openstack/nova-metadata-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.490841 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbb2s\" (UniqueName: \"kubernetes.io/projected/996f5308-25a4-41d7-a335-f255cd014871-kube-api-access-xbb2s\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.490889 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-ovsdbserver-nb\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.490929 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hmdx\" (UniqueName: \"kubernetes.io/projected/0cce3c9e-9deb-46e0-bc8d-f83725925996-kube-api-access-8hmdx\") pod \"nova-metadata-0\" (UID: \"0cce3c9e-9deb-46e0-bc8d-f83725925996\") " 
pod="openstack/nova-metadata-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.490948 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cce3c9e-9deb-46e0-bc8d-f83725925996-logs\") pod \"nova-metadata-0\" (UID: \"0cce3c9e-9deb-46e0-bc8d-f83725925996\") " pod="openstack/nova-metadata-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.490976 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-dns-svc\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.491007 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-config\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.491052 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-ovsdbserver-sb\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.491097 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cce3c9e-9deb-46e0-bc8d-f83725925996-config-data\") pod \"nova-metadata-0\" (UID: \"0cce3c9e-9deb-46e0-bc8d-f83725925996\") " pod="openstack/nova-metadata-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.491115 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-dns-swift-storage-0\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.491561 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cce3c9e-9deb-46e0-bc8d-f83725925996-logs\") pod \"nova-metadata-0\" (UID: \"0cce3c9e-9deb-46e0-bc8d-f83725925996\") " pod="openstack/nova-metadata-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.496168 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cce3c9e-9deb-46e0-bc8d-f83725925996-config-data\") pod \"nova-metadata-0\" (UID: \"0cce3c9e-9deb-46e0-bc8d-f83725925996\") " pod="openstack/nova-metadata-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.496813 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cce3c9e-9deb-46e0-bc8d-f83725925996-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0cce3c9e-9deb-46e0-bc8d-f83725925996\") " pod="openstack/nova-metadata-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.512764 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hmdx\" 
(UniqueName: \"kubernetes.io/projected/0cce3c9e-9deb-46e0-bc8d-f83725925996-kube-api-access-8hmdx\") pod \"nova-metadata-0\" (UID: \"0cce3c9e-9deb-46e0-bc8d-f83725925996\") " pod="openstack/nova-metadata-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.582870 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.592306 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbb2s\" (UniqueName: \"kubernetes.io/projected/996f5308-25a4-41d7-a335-f255cd014871-kube-api-access-xbb2s\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.592403 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-ovsdbserver-nb\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.592470 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-dns-svc\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.592506 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-config\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.592556 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-ovsdbserver-sb\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.592595 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-dns-swift-storage-0\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.593768 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-ovsdbserver-nb\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.593827 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-dns-svc\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.595821 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-config\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.595964 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-dns-swift-storage-0\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.597868 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-ovsdbserver-sb\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.614630 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbb2s\" (UniqueName: \"kubernetes.io/projected/996f5308-25a4-41d7-a335-f255cd014871-kube-api-access-xbb2s\") pod \"dnsmasq-dns-7f55868c59-5xsq8\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.687738 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.704669 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.846807 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7jx5c"] Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.989215 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7kxdv"] Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.990888 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7kxdv" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.993559 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 18:34:41 crc kubenswrapper[4835]: I1003 18:34:41.996423 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.007042 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7kxdv"] Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.014284 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-scripts\") pod \"nova-cell1-conductor-db-sync-7kxdv\" (UID: \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\") " pod="openstack/nova-cell1-conductor-db-sync-7kxdv" Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.014343 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7kxdv\" (UID: \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\") " pod="openstack/nova-cell1-conductor-db-sync-7kxdv" Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.014419 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbp7q\" (UniqueName: \"kubernetes.io/projected/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-kube-api-access-dbp7q\") pod \"nova-cell1-conductor-db-sync-7kxdv\" (UID: \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\") " pod="openstack/nova-cell1-conductor-db-sync-7kxdv" Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.014440 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-config-data\") pod \"nova-cell1-conductor-db-sync-7kxdv\" (UID: \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\") " pod="openstack/nova-cell1-conductor-db-sync-7kxdv" Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.018124 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.094107 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e4eb715e-8684-471a-9383-f3be1ce5be53","Type":"ContainerStarted","Data":"bb1b0fcafe138d6c998cdc473db6c1ab4d547aee0fb700eb653409d3ba09210d"} Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.095512 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7jx5c" event={"ID":"ce857dee-2b33-413b-8040-6012915be992","Type":"ContainerStarted","Data":"f0fce01a89dfe7517f00e76998ec20f0697e98edad723c677ba1af0dfdb09c2d"} Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.098326 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.121361 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7kxdv\" (UID: \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\") " 
pod="openstack/nova-cell1-conductor-db-sync-7kxdv" Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.121467 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbp7q\" (UniqueName: \"kubernetes.io/projected/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-kube-api-access-dbp7q\") pod \"nova-cell1-conductor-db-sync-7kxdv\" (UID: \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\") " pod="openstack/nova-cell1-conductor-db-sync-7kxdv" Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.121489 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-config-data\") pod \"nova-cell1-conductor-db-sync-7kxdv\" (UID: \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\") " pod="openstack/nova-cell1-conductor-db-sync-7kxdv" Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.121573 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-scripts\") pod \"nova-cell1-conductor-db-sync-7kxdv\" (UID: \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\") " pod="openstack/nova-cell1-conductor-db-sync-7kxdv" Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.139202 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-scripts\") pod \"nova-cell1-conductor-db-sync-7kxdv\" (UID: \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\") " pod="openstack/nova-cell1-conductor-db-sync-7kxdv" Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.139384 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7kxdv\" (UID: \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\") " pod="openstack/nova-cell1-conductor-db-sync-7kxdv" Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.148411 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-config-data\") pod \"nova-cell1-conductor-db-sync-7kxdv\" (UID: \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\") " pod="openstack/nova-cell1-conductor-db-sync-7kxdv" Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.163766 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbp7q\" (UniqueName: \"kubernetes.io/projected/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-kube-api-access-dbp7q\") pod \"nova-cell1-conductor-db-sync-7kxdv\" (UID: \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\") " pod="openstack/nova-cell1-conductor-db-sync-7kxdv" Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.300268 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.315560 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7kxdv" Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.348414 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.575711 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f55868c59-5xsq8"] Oct 03 18:34:42 crc kubenswrapper[4835]: I1003 18:34:42.907899 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7kxdv"] Oct 03 18:34:42 crc kubenswrapper[4835]: W1003 18:34:42.932148 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbaaef5c_4627_46d0_8673_1cf9767ab4d6.slice/crio-97b02b66e56ab44966d7bc8eef3653cd11d49f2f51d926373b17ebe59fbf8506 WatchSource:0}: Error finding container 97b02b66e56ab44966d7bc8eef3653cd11d49f2f51d926373b17ebe59fbf8506: Status 404 returned error can't find the container with id 97b02b66e56ab44966d7bc8eef3653cd11d49f2f51d926373b17ebe59fbf8506 Oct 03 18:34:43 crc kubenswrapper[4835]: I1003 18:34:43.110120 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0cce3c9e-9deb-46e0-bc8d-f83725925996","Type":"ContainerStarted","Data":"ab0a527c66a22cb1333c9abe1d0fcac3568bbbbd6e42d6ebdc5af2de960edb64"} Oct 03 18:34:43 crc kubenswrapper[4835]: I1003 18:34:43.111587 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46a4652d-e363-4818-9561-589c8be56373","Type":"ContainerStarted","Data":"7f034a214f2832d370b02b3d1c51e8f5586d83c9a05f4dcff110d2f632178582"} Oct 03 18:34:43 crc kubenswrapper[4835]: I1003 18:34:43.113209 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7kxdv" event={"ID":"bbaaef5c-4627-46d0-8673-1cf9767ab4d6","Type":"ContainerStarted","Data":"97b02b66e56ab44966d7bc8eef3653cd11d49f2f51d926373b17ebe59fbf8506"} Oct 03 18:34:43 crc kubenswrapper[4835]: I1003 18:34:43.115711 4835 generic.go:334] "Generic (PLEG): container finished" podID="996f5308-25a4-41d7-a335-f255cd014871" containerID="3930ed02ba81d90d039f9144c83eb9f8200b06eabee3a3fbe09fdc92d6926d28" exitCode=0 Oct 03 18:34:43 crc kubenswrapper[4835]: I1003 18:34:43.115797 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" event={"ID":"996f5308-25a4-41d7-a335-f255cd014871","Type":"ContainerDied","Data":"3930ed02ba81d90d039f9144c83eb9f8200b06eabee3a3fbe09fdc92d6926d28"} Oct 03 18:34:43 crc kubenswrapper[4835]: I1003 18:34:43.115850 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" event={"ID":"996f5308-25a4-41d7-a335-f255cd014871","Type":"ContainerStarted","Data":"9c6f36a01ba2a30ee63c0500f0d50db2bbe2e8cb4938cb858d3374ee01cea314"} Oct 03 18:34:43 crc kubenswrapper[4835]: I1003 18:34:43.120493 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf4e074a-b323-4dd7-98b4-5e18e3ab6246","Type":"ContainerStarted","Data":"ec0ae7ac8dfaf3c83e8fcd83acdfd2a943007d6248da7b978cc74d7f2492eb77"} Oct 03 18:34:43 crc kubenswrapper[4835]: I1003 18:34:43.131780 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7jx5c" event={"ID":"ce857dee-2b33-413b-8040-6012915be992","Type":"ContainerStarted","Data":"668e0906abe850415158c66fcbd94a09b8b5348b23de4d3d223e7dd30674d154"} Oct 03 18:34:43 crc 
kubenswrapper[4835]: I1003 18:34:43.203440 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-7jx5c" podStartSLOduration=3.203420637 podStartE2EDuration="3.203420637s" podCreationTimestamp="2025-10-03 18:34:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:34:43.159495259 +0000 UTC m=+1224.875436151" watchObservedRunningTime="2025-10-03 18:34:43.203420637 +0000 UTC m=+1224.919361509" Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.154248 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" event={"ID":"996f5308-25a4-41d7-a335-f255cd014871","Type":"ContainerStarted","Data":"f90cd51b5dc91ed8d3c7b76cb5f4f97e99225eeed2f3ca1f2e4a6e6aafa83cd8"} Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.154842 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.158435 4835 generic.go:334] "Generic (PLEG): container finished" podID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerID="ed51c69d7e999197964c6640b29e3b033c37805585c66315bbaab6947526f079" exitCode=0 Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.158493 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8110d0e5-9e19-4306-b8aa-babe937e8d2a","Type":"ContainerDied","Data":"ed51c69d7e999197964c6640b29e3b033c37805585c66315bbaab6947526f079"} Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.158534 4835 scope.go:117] "RemoveContainer" containerID="97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c" Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.160322 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7kxdv" event={"ID":"bbaaef5c-4627-46d0-8673-1cf9767ab4d6","Type":"ContainerStarted","Data":"89f2716fb9948d66f1da8169f84213d5c6f6d515d31ef7074dcd34df44aa9f39"} Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.183682 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" podStartSLOduration=3.183662702 podStartE2EDuration="3.183662702s" podCreationTimestamp="2025-10-03 18:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:34:44.171167578 +0000 UTC m=+1225.887108450" watchObservedRunningTime="2025-10-03 18:34:44.183662702 +0000 UTC m=+1225.899603574" Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.206192 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7kxdv" podStartSLOduration=3.206171979 podStartE2EDuration="3.206171979s" podCreationTimestamp="2025-10-03 18:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:34:44.203243168 +0000 UTC m=+1225.919184060" watchObservedRunningTime="2025-10-03 18:34:44.206171979 +0000 UTC m=+1225.922112851" Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.302521 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.392884 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8110d0e5-9e19-4306-b8aa-babe937e8d2a-logs\") pod \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.392934 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-combined-ca-bundle\") pod \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.392955 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-config-data\") pod \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.393037 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-custom-prometheus-ca\") pod \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.393185 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjdxn\" (UniqueName: \"kubernetes.io/projected/8110d0e5-9e19-4306-b8aa-babe937e8d2a-kube-api-access-hjdxn\") pod \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\" (UID: \"8110d0e5-9e19-4306-b8aa-babe937e8d2a\") " Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.393589 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8110d0e5-9e19-4306-b8aa-babe937e8d2a-logs" (OuterVolumeSpecName: "logs") pod "8110d0e5-9e19-4306-b8aa-babe937e8d2a" (UID: "8110d0e5-9e19-4306-b8aa-babe937e8d2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.393715 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8110d0e5-9e19-4306-b8aa-babe937e8d2a-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.428537 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8110d0e5-9e19-4306-b8aa-babe937e8d2a" (UID: "8110d0e5-9e19-4306-b8aa-babe937e8d2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.428542 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8110d0e5-9e19-4306-b8aa-babe937e8d2a-kube-api-access-hjdxn" (OuterVolumeSpecName: "kube-api-access-hjdxn") pod "8110d0e5-9e19-4306-b8aa-babe937e8d2a" (UID: "8110d0e5-9e19-4306-b8aa-babe937e8d2a"). InnerVolumeSpecName "kube-api-access-hjdxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.431165 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "8110d0e5-9e19-4306-b8aa-babe937e8d2a" (UID: "8110d0e5-9e19-4306-b8aa-babe937e8d2a"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.460925 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-config-data" (OuterVolumeSpecName: "config-data") pod "8110d0e5-9e19-4306-b8aa-babe937e8d2a" (UID: "8110d0e5-9e19-4306-b8aa-babe937e8d2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.495209 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.495237 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.495248 4835 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8110d0e5-9e19-4306-b8aa-babe937e8d2a-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:44 crc kubenswrapper[4835]: I1003 18:34:44.495259 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjdxn\" (UniqueName: \"kubernetes.io/projected/8110d0e5-9e19-4306-b8aa-babe937e8d2a-kube-api-access-hjdxn\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.160300 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.168504 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.173195 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8110d0e5-9e19-4306-b8aa-babe937e8d2a","Type":"ContainerDied","Data":"a65f3c32097efdcccd2d76df11f8f8af766a5fe3f2b6de5c70d7fbdf3bbbf7aa"} Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.174300 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.194778 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.210381 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.225298 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 18:34:45 crc kubenswrapper[4835]: E1003 18:34:45.225852 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.226048 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" Oct 03 18:34:45 crc kubenswrapper[4835]: E1003 18:34:45.226088 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.226094 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" Oct 03 18:34:45 crc kubenswrapper[4835]: E1003 18:34:45.226108 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.226113 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" Oct 03 18:34:45 crc kubenswrapper[4835]: E1003 18:34:45.226128 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.226135 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.226304 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.226315 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.226329 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.227062 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.229808 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.234315 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.314452 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/929877a5-090b-46c5-ac19-f2ba3c72231f-config-data\") pod \"watcher-decision-engine-0\" (UID: \"929877a5-090b-46c5-ac19-f2ba3c72231f\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.314695 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz4ln\" (UniqueName: \"kubernetes.io/projected/929877a5-090b-46c5-ac19-f2ba3c72231f-kube-api-access-zz4ln\") pod \"watcher-decision-engine-0\" (UID: \"929877a5-090b-46c5-ac19-f2ba3c72231f\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.314906 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/929877a5-090b-46c5-ac19-f2ba3c72231f-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"929877a5-090b-46c5-ac19-f2ba3c72231f\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.315049 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/929877a5-090b-46c5-ac19-f2ba3c72231f-logs\") pod \"watcher-decision-engine-0\" (UID: \"929877a5-090b-46c5-ac19-f2ba3c72231f\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.315095 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/929877a5-090b-46c5-ac19-f2ba3c72231f-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"929877a5-090b-46c5-ac19-f2ba3c72231f\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.416517 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/929877a5-090b-46c5-ac19-f2ba3c72231f-logs\") pod \"watcher-decision-engine-0\" (UID: \"929877a5-090b-46c5-ac19-f2ba3c72231f\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.416599 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/929877a5-090b-46c5-ac19-f2ba3c72231f-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"929877a5-090b-46c5-ac19-f2ba3c72231f\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.416657 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/929877a5-090b-46c5-ac19-f2ba3c72231f-config-data\") pod \"watcher-decision-engine-0\" (UID: \"929877a5-090b-46c5-ac19-f2ba3c72231f\") " pod="openstack/watcher-decision-engine-0" Oct 03 
18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.416984 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz4ln\" (UniqueName: \"kubernetes.io/projected/929877a5-090b-46c5-ac19-f2ba3c72231f-kube-api-access-zz4ln\") pod \"watcher-decision-engine-0\" (UID: \"929877a5-090b-46c5-ac19-f2ba3c72231f\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.417130 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/929877a5-090b-46c5-ac19-f2ba3c72231f-logs\") pod \"watcher-decision-engine-0\" (UID: \"929877a5-090b-46c5-ac19-f2ba3c72231f\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.417608 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/929877a5-090b-46c5-ac19-f2ba3c72231f-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"929877a5-090b-46c5-ac19-f2ba3c72231f\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.434690 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/929877a5-090b-46c5-ac19-f2ba3c72231f-config-data\") pod \"watcher-decision-engine-0\" (UID: \"929877a5-090b-46c5-ac19-f2ba3c72231f\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.435881 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz4ln\" (UniqueName: \"kubernetes.io/projected/929877a5-090b-46c5-ac19-f2ba3c72231f-kube-api-access-zz4ln\") pod \"watcher-decision-engine-0\" (UID: \"929877a5-090b-46c5-ac19-f2ba3c72231f\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.447691 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/929877a5-090b-46c5-ac19-f2ba3c72231f-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"929877a5-090b-46c5-ac19-f2ba3c72231f\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.456248 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/929877a5-090b-46c5-ac19-f2ba3c72231f-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"929877a5-090b-46c5-ac19-f2ba3c72231f\") " pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.542324 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 03 18:34:45 crc kubenswrapper[4835]: I1003 18:34:45.946774 4835 scope.go:117] "RemoveContainer" containerID="ed51c69d7e999197964c6640b29e3b033c37805585c66315bbaab6947526f079" Oct 03 18:34:46 crc kubenswrapper[4835]: I1003 18:34:46.010420 4835 scope.go:117] "RemoveContainer" containerID="97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c" Oct 03 18:34:46 crc kubenswrapper[4835]: E1003 18:34:46.011937 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c\": container with ID starting with 97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c not found: ID does not exist" containerID="97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c" Oct 03 18:34:46 crc kubenswrapper[4835]: I1003 18:34:46.011976 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c"} err="failed to get container status \"97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c\": rpc error: code = NotFound desc = could not find container \"97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c\": container with ID starting with 97bedbf1bb36085eae3d24b180930579d4c56cbcda8064d853fa1177dcc37e0c not found: ID does not exist" Oct 03 18:34:46 crc kubenswrapper[4835]: I1003 18:34:46.514701 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 18:34:46 crc kubenswrapper[4835]: I1003 18:34:46.892520 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" path="/var/lib/kubelet/pods/8110d0e5-9e19-4306-b8aa-babe937e8d2a/volumes" Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.200862 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e4eb715e-8684-471a-9383-f3be1ce5be53","Type":"ContainerStarted","Data":"68d1701767f131fd92108bcbb6b0653e53df73c66fdf48876d8f697a7fa137e9"} Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.200980 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e4eb715e-8684-471a-9383-f3be1ce5be53" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://68d1701767f131fd92108bcbb6b0653e53df73c66fdf48876d8f697a7fa137e9" gracePeriod=30 Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.204533 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"929877a5-090b-46c5-ac19-f2ba3c72231f","Type":"ContainerStarted","Data":"dd2cac8d5aeb628ea601ad0b441edebe791e3a95f0f7955f4cd58ef7255d208e"} Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.204568 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"929877a5-090b-46c5-ac19-f2ba3c72231f","Type":"ContainerStarted","Data":"aad3e4064d26d61b66547b414a26d768cd60f059acb0f70bf1767c7b622e5865"} Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.212155 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0cce3c9e-9deb-46e0-bc8d-f83725925996","Type":"ContainerStarted","Data":"a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19"} Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 
18:34:47.212201 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0cce3c9e-9deb-46e0-bc8d-f83725925996","Type":"ContainerStarted","Data":"896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214"} Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.212304 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0cce3c9e-9deb-46e0-bc8d-f83725925996" containerName="nova-metadata-log" containerID="cri-o://896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214" gracePeriod=30 Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.212395 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0cce3c9e-9deb-46e0-bc8d-f83725925996" containerName="nova-metadata-metadata" containerID="cri-o://a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19" gracePeriod=30 Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.224029 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46a4652d-e363-4818-9561-589c8be56373","Type":"ContainerStarted","Data":"268eafef82b4556e3766dc232b2119cd77f3d227316f2b043a1750902060d4bf"} Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.230574 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.219086079 podStartE2EDuration="7.230551268s" podCreationTimestamp="2025-10-03 18:34:40 +0000 UTC" firstStartedPulling="2025-10-03 18:34:42.015144753 +0000 UTC m=+1223.731085625" lastFinishedPulling="2025-10-03 18:34:46.026609942 +0000 UTC m=+1227.742550814" observedRunningTime="2025-10-03 18:34:47.220539934 +0000 UTC m=+1228.936480806" watchObservedRunningTime="2025-10-03 18:34:47.230551268 +0000 UTC m=+1228.946492140" Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.231001 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf4e074a-b323-4dd7-98b4-5e18e3ab6246","Type":"ContainerStarted","Data":"b9be68c011e4c3d26aa7c6575cc2ce9441fcb60c1e7541330ef68c599ba12c90"} Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.231053 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf4e074a-b323-4dd7-98b4-5e18e3ab6246","Type":"ContainerStarted","Data":"2880256aef8b5fe323c35578ece5a0d31b1aabba95a3adf0e534504e3e906ed1"} Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.249554 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.653830274 podStartE2EDuration="6.24953381s" podCreationTimestamp="2025-10-03 18:34:41 +0000 UTC" firstStartedPulling="2025-10-03 18:34:42.43190801 +0000 UTC m=+1224.147848882" lastFinishedPulling="2025-10-03 18:34:46.027611546 +0000 UTC m=+1227.743552418" observedRunningTime="2025-10-03 18:34:47.23761094 +0000 UTC m=+1228.953552092" watchObservedRunningTime="2025-10-03 18:34:47.24953381 +0000 UTC m=+1228.965474682" Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.270309 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.270286035 podStartE2EDuration="2.270286035s" podCreationTimestamp="2025-10-03 18:34:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:34:47.261021389 +0000 UTC 
m=+1228.976962251" watchObservedRunningTime="2025-10-03 18:34:47.270286035 +0000 UTC m=+1228.986226907" Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.284080 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.41395676 podStartE2EDuration="7.28404931s" podCreationTimestamp="2025-10-03 18:34:40 +0000 UTC" firstStartedPulling="2025-10-03 18:34:42.156514542 +0000 UTC m=+1223.872455414" lastFinishedPulling="2025-10-03 18:34:46.026607092 +0000 UTC m=+1227.742547964" observedRunningTime="2025-10-03 18:34:47.279362325 +0000 UTC m=+1228.995303207" watchObservedRunningTime="2025-10-03 18:34:47.28404931 +0000 UTC m=+1228.999990182" Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.298302 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.656078558 podStartE2EDuration="7.298287746s" podCreationTimestamp="2025-10-03 18:34:40 +0000 UTC" firstStartedPulling="2025-10-03 18:34:42.384341033 +0000 UTC m=+1224.100281905" lastFinishedPulling="2025-10-03 18:34:46.026550221 +0000 UTC m=+1227.742491093" observedRunningTime="2025-10-03 18:34:47.296933003 +0000 UTC m=+1229.012873875" watchObservedRunningTime="2025-10-03 18:34:47.298287746 +0000 UTC m=+1229.014228618" Oct 03 18:34:47 crc kubenswrapper[4835]: E1003 18:34:47.459230 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cce3c9e_9deb_46e0_bc8d_f83725925996.slice/crio-896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cce3c9e_9deb_46e0_bc8d_f83725925996.slice/crio-conmon-a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19.scope\": RecentStats: unable to find data in memory cache]" Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.931812 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.987130 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cce3c9e-9deb-46e0-bc8d-f83725925996-combined-ca-bundle\") pod \"0cce3c9e-9deb-46e0-bc8d-f83725925996\" (UID: \"0cce3c9e-9deb-46e0-bc8d-f83725925996\") " Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.988208 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cce3c9e-9deb-46e0-bc8d-f83725925996-logs\") pod \"0cce3c9e-9deb-46e0-bc8d-f83725925996\" (UID: \"0cce3c9e-9deb-46e0-bc8d-f83725925996\") " Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.988563 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cce3c9e-9deb-46e0-bc8d-f83725925996-config-data\") pod \"0cce3c9e-9deb-46e0-bc8d-f83725925996\" (UID: \"0cce3c9e-9deb-46e0-bc8d-f83725925996\") " Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.989082 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cce3c9e-9deb-46e0-bc8d-f83725925996-logs" (OuterVolumeSpecName: "logs") pod "0cce3c9e-9deb-46e0-bc8d-f83725925996" (UID: "0cce3c9e-9deb-46e0-bc8d-f83725925996"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.989535 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hmdx\" (UniqueName: \"kubernetes.io/projected/0cce3c9e-9deb-46e0-bc8d-f83725925996-kube-api-access-8hmdx\") pod \"0cce3c9e-9deb-46e0-bc8d-f83725925996\" (UID: \"0cce3c9e-9deb-46e0-bc8d-f83725925996\") " Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.993373 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cce3c9e-9deb-46e0-bc8d-f83725925996-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:47 crc kubenswrapper[4835]: I1003 18:34:47.995216 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cce3c9e-9deb-46e0-bc8d-f83725925996-kube-api-access-8hmdx" (OuterVolumeSpecName: "kube-api-access-8hmdx") pod "0cce3c9e-9deb-46e0-bc8d-f83725925996" (UID: "0cce3c9e-9deb-46e0-bc8d-f83725925996"). InnerVolumeSpecName "kube-api-access-8hmdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.042221 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cce3c9e-9deb-46e0-bc8d-f83725925996-config-data" (OuterVolumeSpecName: "config-data") pod "0cce3c9e-9deb-46e0-bc8d-f83725925996" (UID: "0cce3c9e-9deb-46e0-bc8d-f83725925996"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.075313 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cce3c9e-9deb-46e0-bc8d-f83725925996-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cce3c9e-9deb-46e0-bc8d-f83725925996" (UID: "0cce3c9e-9deb-46e0-bc8d-f83725925996"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.095619 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cce3c9e-9deb-46e0-bc8d-f83725925996-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.095649 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hmdx\" (UniqueName: \"kubernetes.io/projected/0cce3c9e-9deb-46e0-bc8d-f83725925996-kube-api-access-8hmdx\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.095664 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cce3c9e-9deb-46e0-bc8d-f83725925996-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.241429 4835 generic.go:334] "Generic (PLEG): container finished" podID="0cce3c9e-9deb-46e0-bc8d-f83725925996" containerID="a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19" exitCode=0 Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.241464 4835 generic.go:334] "Generic (PLEG): container finished" podID="0cce3c9e-9deb-46e0-bc8d-f83725925996" containerID="896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214" exitCode=143 Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.242152 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.246283 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0cce3c9e-9deb-46e0-bc8d-f83725925996","Type":"ContainerDied","Data":"a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19"} Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.246355 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0cce3c9e-9deb-46e0-bc8d-f83725925996","Type":"ContainerDied","Data":"896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214"} Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.246372 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0cce3c9e-9deb-46e0-bc8d-f83725925996","Type":"ContainerDied","Data":"ab0a527c66a22cb1333c9abe1d0fcac3568bbbbd6e42d6ebdc5af2de960edb64"} Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.246394 4835 scope.go:117] "RemoveContainer" containerID="a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.266814 4835 scope.go:117] "RemoveContainer" containerID="896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.298034 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.299683 4835 scope.go:117] "RemoveContainer" containerID="a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19" Oct 03 18:34:48 crc kubenswrapper[4835]: E1003 18:34:48.300259 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19\": container with ID starting with a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19 not found: ID does not exist" containerID="a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.300298 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19"} err="failed to get container status \"a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19\": rpc error: code = NotFound desc = could not find container \"a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19\": container with ID starting with a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19 not found: ID does not exist" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.300322 4835 scope.go:117] "RemoveContainer" containerID="896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214" Oct 03 18:34:48 crc kubenswrapper[4835]: E1003 18:34:48.300600 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214\": container with ID starting with 896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214 not found: ID does not exist" containerID="896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.300687 4835 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214"} err="failed to get container status \"896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214\": rpc error: code = NotFound desc = could not find container \"896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214\": container with ID starting with 896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214 not found: ID does not exist" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.300831 4835 scope.go:117] "RemoveContainer" containerID="a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.302567 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19"} err="failed to get container status \"a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19\": rpc error: code = NotFound desc = could not find container \"a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19\": container with ID starting with a3638f40a4a43bc3e8f5e4d5ae06ca5ac2a6c2ce1e7924290d87460978adbb19 not found: ID does not exist" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.302604 4835 scope.go:117] "RemoveContainer" containerID="896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.303232 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214"} err="failed to get container status \"896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214\": rpc error: code = NotFound desc = could not find container \"896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214\": container with ID starting with 896cb9b3ff630c9c0165941710e5cbb0f274f9d74d3d1deca4c8eb46f6a42214 not found: ID does not exist" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.324132 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.333487 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:34:48 crc kubenswrapper[4835]: E1003 18:34:48.334122 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cce3c9e-9deb-46e0-bc8d-f83725925996" containerName="nova-metadata-metadata" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.334167 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cce3c9e-9deb-46e0-bc8d-f83725925996" containerName="nova-metadata-metadata" Oct 03 18:34:48 crc kubenswrapper[4835]: E1003 18:34:48.334220 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cce3c9e-9deb-46e0-bc8d-f83725925996" containerName="nova-metadata-log" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.334231 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cce3c9e-9deb-46e0-bc8d-f83725925996" containerName="nova-metadata-log" Oct 03 18:34:48 crc kubenswrapper[4835]: E1003 18:34:48.334247 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.334256 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" Oct 03 18:34:48 crc 
kubenswrapper[4835]: I1003 18:34:48.334511 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.334544 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cce3c9e-9deb-46e0-bc8d-f83725925996" containerName="nova-metadata-log" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.334561 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cce3c9e-9deb-46e0-bc8d-f83725925996" containerName="nova-metadata-metadata" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.335089 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8110d0e5-9e19-4306-b8aa-babe937e8d2a" containerName="watcher-decision-engine" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.337939 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.343202 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.343492 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.343833 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.401275 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.401424 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-config-data\") pod \"nova-metadata-0\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.401466 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8060b0d-7be6-4352-a8b2-74200dc61fdf-logs\") pod \"nova-metadata-0\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.401500 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttb9t\" (UniqueName: \"kubernetes.io/projected/e8060b0d-7be6-4352-a8b2-74200dc61fdf-kube-api-access-ttb9t\") pod \"nova-metadata-0\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.401528 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.503481 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-config-data\") pod \"nova-metadata-0\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.503550 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8060b0d-7be6-4352-a8b2-74200dc61fdf-logs\") pod \"nova-metadata-0\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.503592 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttb9t\" (UniqueName: \"kubernetes.io/projected/e8060b0d-7be6-4352-a8b2-74200dc61fdf-kube-api-access-ttb9t\") pod \"nova-metadata-0\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.503621 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.503661 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.504109 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8060b0d-7be6-4352-a8b2-74200dc61fdf-logs\") pod \"nova-metadata-0\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.508597 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.508642 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-config-data\") pod \"nova-metadata-0\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.515131 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.522760 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttb9t\" (UniqueName: \"kubernetes.io/projected/e8060b0d-7be6-4352-a8b2-74200dc61fdf-kube-api-access-ttb9t\") pod \"nova-metadata-0\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.665088 4835 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 18:34:48 crc kubenswrapper[4835]: I1003 18:34:48.897797 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cce3c9e-9deb-46e0-bc8d-f83725925996" path="/var/lib/kubelet/pods/0cce3c9e-9deb-46e0-bc8d-f83725925996/volumes" Oct 03 18:34:49 crc kubenswrapper[4835]: I1003 18:34:49.177606 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:34:49 crc kubenswrapper[4835]: W1003 18:34:49.182297 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8060b0d_7be6_4352_a8b2_74200dc61fdf.slice/crio-aa2e4bcbb1081f995f664bd71058e5a6da487ab22e81e96d2beedf5f382894c2 WatchSource:0}: Error finding container aa2e4bcbb1081f995f664bd71058e5a6da487ab22e81e96d2beedf5f382894c2: Status 404 returned error can't find the container with id aa2e4bcbb1081f995f664bd71058e5a6da487ab22e81e96d2beedf5f382894c2 Oct 03 18:34:49 crc kubenswrapper[4835]: I1003 18:34:49.258851 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e8060b0d-7be6-4352-a8b2-74200dc61fdf","Type":"ContainerStarted","Data":"aa2e4bcbb1081f995f664bd71058e5a6da487ab22e81e96d2beedf5f382894c2"} Oct 03 18:34:50 crc kubenswrapper[4835]: I1003 18:34:50.269432 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e8060b0d-7be6-4352-a8b2-74200dc61fdf","Type":"ContainerStarted","Data":"461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee"} Oct 03 18:34:50 crc kubenswrapper[4835]: I1003 18:34:50.269784 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e8060b0d-7be6-4352-a8b2-74200dc61fdf","Type":"ContainerStarted","Data":"6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0"} Oct 03 18:34:50 crc kubenswrapper[4835]: I1003 18:34:50.291860 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.291841862 podStartE2EDuration="2.291841862s" podCreationTimestamp="2025-10-03 18:34:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:34:50.287556688 +0000 UTC m=+1232.003497560" watchObservedRunningTime="2025-10-03 18:34:50.291841862 +0000 UTC m=+1232.007782734" Oct 03 18:34:51 crc kubenswrapper[4835]: I1003 18:34:51.334189 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:34:51 crc kubenswrapper[4835]: I1003 18:34:51.400304 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 18:34:51 crc kubenswrapper[4835]: I1003 18:34:51.400403 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 18:34:51 crc kubenswrapper[4835]: I1003 18:34:51.583774 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 18:34:51 crc kubenswrapper[4835]: I1003 18:34:51.583839 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 18:34:51 crc kubenswrapper[4835]: I1003 18:34:51.612216 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 18:34:51 crc kubenswrapper[4835]: I1003 18:34:51.707228 
4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:34:51 crc kubenswrapper[4835]: I1003 18:34:51.789056 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d779b7fdc-pqmkj"] Oct 03 18:34:51 crc kubenswrapper[4835]: I1003 18:34:51.789310 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" podUID="e203cdb5-ef30-469a-bc4e-3bae2306043d" containerName="dnsmasq-dns" containerID="cri-o://c3f706c82d7f322ce42be521f77e1ac32da0134e3f8d4db9cd0f902102354180" gracePeriod=10 Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.327271 4835 generic.go:334] "Generic (PLEG): container finished" podID="e203cdb5-ef30-469a-bc4e-3bae2306043d" containerID="c3f706c82d7f322ce42be521f77e1ac32da0134e3f8d4db9cd0f902102354180" exitCode=0 Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.327587 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" event={"ID":"e203cdb5-ef30-469a-bc4e-3bae2306043d","Type":"ContainerDied","Data":"c3f706c82d7f322ce42be521f77e1ac32da0134e3f8d4db9cd0f902102354180"} Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.340039 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce857dee-2b33-413b-8040-6012915be992" containerID="668e0906abe850415158c66fcbd94a09b8b5348b23de4d3d223e7dd30674d154" exitCode=0 Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.341289 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7jx5c" event={"ID":"ce857dee-2b33-413b-8040-6012915be992","Type":"ContainerDied","Data":"668e0906abe850415158c66fcbd94a09b8b5348b23de4d3d223e7dd30674d154"} Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.386848 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.484339 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bf4e074a-b323-4dd7-98b4-5e18e3ab6246" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.484624 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bf4e074a-b323-4dd7-98b4-5e18e3ab6246" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.207:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.485174 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.596360 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-ovsdbserver-sb\") pod \"e203cdb5-ef30-469a-bc4e-3bae2306043d\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.597147 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-ovsdbserver-nb\") pod \"e203cdb5-ef30-469a-bc4e-3bae2306043d\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.597320 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7l57\" (UniqueName: \"kubernetes.io/projected/e203cdb5-ef30-469a-bc4e-3bae2306043d-kube-api-access-g7l57\") pod \"e203cdb5-ef30-469a-bc4e-3bae2306043d\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.597358 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-dns-swift-storage-0\") pod \"e203cdb5-ef30-469a-bc4e-3bae2306043d\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.597394 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-dns-svc\") pod \"e203cdb5-ef30-469a-bc4e-3bae2306043d\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.597434 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-config\") pod \"e203cdb5-ef30-469a-bc4e-3bae2306043d\" (UID: \"e203cdb5-ef30-469a-bc4e-3bae2306043d\") " Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.605517 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e203cdb5-ef30-469a-bc4e-3bae2306043d-kube-api-access-g7l57" (OuterVolumeSpecName: "kube-api-access-g7l57") pod "e203cdb5-ef30-469a-bc4e-3bae2306043d" (UID: "e203cdb5-ef30-469a-bc4e-3bae2306043d"). InnerVolumeSpecName "kube-api-access-g7l57". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.659977 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e203cdb5-ef30-469a-bc4e-3bae2306043d" (UID: "e203cdb5-ef30-469a-bc4e-3bae2306043d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.661788 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e203cdb5-ef30-469a-bc4e-3bae2306043d" (UID: "e203cdb5-ef30-469a-bc4e-3bae2306043d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.684562 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e203cdb5-ef30-469a-bc4e-3bae2306043d" (UID: "e203cdb5-ef30-469a-bc4e-3bae2306043d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.685502 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e203cdb5-ef30-469a-bc4e-3bae2306043d" (UID: "e203cdb5-ef30-469a-bc4e-3bae2306043d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.696351 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-config" (OuterVolumeSpecName: "config") pod "e203cdb5-ef30-469a-bc4e-3bae2306043d" (UID: "e203cdb5-ef30-469a-bc4e-3bae2306043d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.699717 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.699752 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.699763 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7l57\" (UniqueName: \"kubernetes.io/projected/e203cdb5-ef30-469a-bc4e-3bae2306043d-kube-api-access-g7l57\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.699774 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.699783 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:52 crc kubenswrapper[4835]: I1003 18:34:52.699793 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e203cdb5-ef30-469a-bc4e-3bae2306043d-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.350833 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" event={"ID":"e203cdb5-ef30-469a-bc4e-3bae2306043d","Type":"ContainerDied","Data":"44b6ab078a4a1f621363e1dae21ec0b86b606c41fdc4e421967d3dad539f8e72"} Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.351188 4835 scope.go:117] "RemoveContainer" containerID="c3f706c82d7f322ce42be521f77e1ac32da0134e3f8d4db9cd0f902102354180" Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.350895 4835 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.354099 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7kxdv" event={"ID":"bbaaef5c-4627-46d0-8673-1cf9767ab4d6","Type":"ContainerDied","Data":"89f2716fb9948d66f1da8169f84213d5c6f6d515d31ef7074dcd34df44aa9f39"} Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.354063 4835 generic.go:334] "Generic (PLEG): container finished" podID="bbaaef5c-4627-46d0-8673-1cf9767ab4d6" containerID="89f2716fb9948d66f1da8169f84213d5c6f6d515d31ef7074dcd34df44aa9f39" exitCode=0 Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.391486 4835 scope.go:117] "RemoveContainer" containerID="41068cf045e7c4a39ad4c034b2f988174ffbfe924d3cce62f6211a4283b6cfff" Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.424942 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d779b7fdc-pqmkj"] Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.432300 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d779b7fdc-pqmkj"] Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.665762 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.665841 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.790910 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7jx5c" Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.818482 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-config-data\") pod \"ce857dee-2b33-413b-8040-6012915be992\" (UID: \"ce857dee-2b33-413b-8040-6012915be992\") " Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.818548 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-combined-ca-bundle\") pod \"ce857dee-2b33-413b-8040-6012915be992\" (UID: \"ce857dee-2b33-413b-8040-6012915be992\") " Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.818767 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-scripts\") pod \"ce857dee-2b33-413b-8040-6012915be992\" (UID: \"ce857dee-2b33-413b-8040-6012915be992\") " Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.818834 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp8sl\" (UniqueName: \"kubernetes.io/projected/ce857dee-2b33-413b-8040-6012915be992-kube-api-access-sp8sl\") pod \"ce857dee-2b33-413b-8040-6012915be992\" (UID: \"ce857dee-2b33-413b-8040-6012915be992\") " Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.826003 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce857dee-2b33-413b-8040-6012915be992-kube-api-access-sp8sl" (OuterVolumeSpecName: "kube-api-access-sp8sl") pod "ce857dee-2b33-413b-8040-6012915be992" (UID: "ce857dee-2b33-413b-8040-6012915be992"). InnerVolumeSpecName "kube-api-access-sp8sl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.832273 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-scripts" (OuterVolumeSpecName: "scripts") pod "ce857dee-2b33-413b-8040-6012915be992" (UID: "ce857dee-2b33-413b-8040-6012915be992"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.849291 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce857dee-2b33-413b-8040-6012915be992" (UID: "ce857dee-2b33-413b-8040-6012915be992"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.864832 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-config-data" (OuterVolumeSpecName: "config-data") pod "ce857dee-2b33-413b-8040-6012915be992" (UID: "ce857dee-2b33-413b-8040-6012915be992"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.921317 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.921345 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp8sl\" (UniqueName: \"kubernetes.io/projected/ce857dee-2b33-413b-8040-6012915be992-kube-api-access-sp8sl\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.921356 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:53 crc kubenswrapper[4835]: I1003 18:34:53.921367 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce857dee-2b33-413b-8040-6012915be992-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.372276 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7jx5c" Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.373207 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7jx5c" event={"ID":"ce857dee-2b33-413b-8040-6012915be992","Type":"ContainerDied","Data":"f0fce01a89dfe7517f00e76998ec20f0697e98edad723c677ba1af0dfdb09c2d"} Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.373280 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0fce01a89dfe7517f00e76998ec20f0697e98edad723c677ba1af0dfdb09c2d" Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.575149 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.575476 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bf4e074a-b323-4dd7-98b4-5e18e3ab6246" containerName="nova-api-log" containerID="cri-o://2880256aef8b5fe323c35578ece5a0d31b1aabba95a3adf0e534504e3e906ed1" gracePeriod=30 Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.575645 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bf4e074a-b323-4dd7-98b4-5e18e3ab6246" containerName="nova-api-api" containerID="cri-o://b9be68c011e4c3d26aa7c6575cc2ce9441fcb60c1e7541330ef68c599ba12c90" gracePeriod=30 Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.593355 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.593624 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="46a4652d-e363-4818-9561-589c8be56373" containerName="nova-scheduler-scheduler" containerID="cri-o://268eafef82b4556e3766dc232b2119cd77f3d227316f2b043a1750902060d4bf" gracePeriod=30 Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.614850 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.789457 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7kxdv" Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.838366 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-scripts\") pod \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\" (UID: \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\") " Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.838512 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-combined-ca-bundle\") pod \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\" (UID: \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\") " Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.838626 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbp7q\" (UniqueName: \"kubernetes.io/projected/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-kube-api-access-dbp7q\") pod \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\" (UID: \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\") " Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.838678 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-config-data\") pod \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\" (UID: \"bbaaef5c-4627-46d0-8673-1cf9767ab4d6\") " Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.843275 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-kube-api-access-dbp7q" (OuterVolumeSpecName: "kube-api-access-dbp7q") pod "bbaaef5c-4627-46d0-8673-1cf9767ab4d6" (UID: "bbaaef5c-4627-46d0-8673-1cf9767ab4d6"). InnerVolumeSpecName "kube-api-access-dbp7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.846362 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-scripts" (OuterVolumeSpecName: "scripts") pod "bbaaef5c-4627-46d0-8673-1cf9767ab4d6" (UID: "bbaaef5c-4627-46d0-8673-1cf9767ab4d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.868110 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbaaef5c-4627-46d0-8673-1cf9767ab4d6" (UID: "bbaaef5c-4627-46d0-8673-1cf9767ab4d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.870168 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-config-data" (OuterVolumeSpecName: "config-data") pod "bbaaef5c-4627-46d0-8673-1cf9767ab4d6" (UID: "bbaaef5c-4627-46d0-8673-1cf9767ab4d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.887164 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e203cdb5-ef30-469a-bc4e-3bae2306043d" path="/var/lib/kubelet/pods/e203cdb5-ef30-469a-bc4e-3bae2306043d/volumes" Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.941621 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.941666 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbp7q\" (UniqueName: \"kubernetes.io/projected/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-kube-api-access-dbp7q\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.941687 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:54 crc kubenswrapper[4835]: I1003 18:34:54.941695 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbaaef5c-4627-46d0-8673-1cf9767ab4d6-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.394762 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7kxdv" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.394759 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7kxdv" event={"ID":"bbaaef5c-4627-46d0-8673-1cf9767ab4d6","Type":"ContainerDied","Data":"97b02b66e56ab44966d7bc8eef3653cd11d49f2f51d926373b17ebe59fbf8506"} Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.394915 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97b02b66e56ab44966d7bc8eef3653cd11d49f2f51d926373b17ebe59fbf8506" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.406790 4835 generic.go:334] "Generic (PLEG): container finished" podID="bf4e074a-b323-4dd7-98b4-5e18e3ab6246" containerID="2880256aef8b5fe323c35578ece5a0d31b1aabba95a3adf0e534504e3e906ed1" exitCode=143 Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.406907 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf4e074a-b323-4dd7-98b4-5e18e3ab6246","Type":"ContainerDied","Data":"2880256aef8b5fe323c35578ece5a0d31b1aabba95a3adf0e534504e3e906ed1"} Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.406985 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e8060b0d-7be6-4352-a8b2-74200dc61fdf" containerName="nova-metadata-log" containerID="cri-o://6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0" gracePeriod=30 Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.407052 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e8060b0d-7be6-4352-a8b2-74200dc61fdf" containerName="nova-metadata-metadata" containerID="cri-o://461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee" gracePeriod=30 Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.522061 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 18:34:55 crc 
kubenswrapper[4835]: E1003 18:34:55.522523 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbaaef5c-4627-46d0-8673-1cf9767ab4d6" containerName="nova-cell1-conductor-db-sync" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.522539 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbaaef5c-4627-46d0-8673-1cf9767ab4d6" containerName="nova-cell1-conductor-db-sync" Oct 03 18:34:55 crc kubenswrapper[4835]: E1003 18:34:55.522557 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce857dee-2b33-413b-8040-6012915be992" containerName="nova-manage" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.522564 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce857dee-2b33-413b-8040-6012915be992" containerName="nova-manage" Oct 03 18:34:55 crc kubenswrapper[4835]: E1003 18:34:55.522597 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e203cdb5-ef30-469a-bc4e-3bae2306043d" containerName="init" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.522605 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e203cdb5-ef30-469a-bc4e-3bae2306043d" containerName="init" Oct 03 18:34:55 crc kubenswrapper[4835]: E1003 18:34:55.522618 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e203cdb5-ef30-469a-bc4e-3bae2306043d" containerName="dnsmasq-dns" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.522624 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e203cdb5-ef30-469a-bc4e-3bae2306043d" containerName="dnsmasq-dns" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.522804 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e203cdb5-ef30-469a-bc4e-3bae2306043d" containerName="dnsmasq-dns" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.522829 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce857dee-2b33-413b-8040-6012915be992" containerName="nova-manage" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.522842 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbaaef5c-4627-46d0-8673-1cf9767ab4d6" containerName="nova-cell1-conductor-db-sync" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.523819 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.529268 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.534947 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.543101 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.551237 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56348f36-94bd-43e7-a6ea-d55206a5ccc3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"56348f36-94bd-43e7-a6ea-d55206a5ccc3\") " pod="openstack/nova-cell1-conductor-0" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.551295 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw9dl\" (UniqueName: \"kubernetes.io/projected/56348f36-94bd-43e7-a6ea-d55206a5ccc3-kube-api-access-kw9dl\") pod \"nova-cell1-conductor-0\" (UID: \"56348f36-94bd-43e7-a6ea-d55206a5ccc3\") " pod="openstack/nova-cell1-conductor-0" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.551325 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56348f36-94bd-43e7-a6ea-d55206a5ccc3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"56348f36-94bd-43e7-a6ea-d55206a5ccc3\") " pod="openstack/nova-cell1-conductor-0" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.589472 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.652218 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56348f36-94bd-43e7-a6ea-d55206a5ccc3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"56348f36-94bd-43e7-a6ea-d55206a5ccc3\") " pod="openstack/nova-cell1-conductor-0" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.652276 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw9dl\" (UniqueName: \"kubernetes.io/projected/56348f36-94bd-43e7-a6ea-d55206a5ccc3-kube-api-access-kw9dl\") pod \"nova-cell1-conductor-0\" (UID: \"56348f36-94bd-43e7-a6ea-d55206a5ccc3\") " pod="openstack/nova-cell1-conductor-0" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.652316 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56348f36-94bd-43e7-a6ea-d55206a5ccc3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"56348f36-94bd-43e7-a6ea-d55206a5ccc3\") " pod="openstack/nova-cell1-conductor-0" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.656945 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56348f36-94bd-43e7-a6ea-d55206a5ccc3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"56348f36-94bd-43e7-a6ea-d55206a5ccc3\") " pod="openstack/nova-cell1-conductor-0" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.668017 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56348f36-94bd-43e7-a6ea-d55206a5ccc3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"56348f36-94bd-43e7-a6ea-d55206a5ccc3\") " pod="openstack/nova-cell1-conductor-0" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.674531 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw9dl\" (UniqueName: \"kubernetes.io/projected/56348f36-94bd-43e7-a6ea-d55206a5ccc3-kube-api-access-kw9dl\") pod \"nova-cell1-conductor-0\" (UID: \"56348f36-94bd-43e7-a6ea-d55206a5ccc3\") " pod="openstack/nova-cell1-conductor-0" Oct 03 18:34:55 crc kubenswrapper[4835]: I1003 18:34:55.888178 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.035013 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.059743 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-config-data\") pod \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.059804 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttb9t\" (UniqueName: \"kubernetes.io/projected/e8060b0d-7be6-4352-a8b2-74200dc61fdf-kube-api-access-ttb9t\") pod \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.059920 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-nova-metadata-tls-certs\") pod \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.060154 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8060b0d-7be6-4352-a8b2-74200dc61fdf-logs\") pod \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.060214 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-combined-ca-bundle\") pod \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\" (UID: \"e8060b0d-7be6-4352-a8b2-74200dc61fdf\") " Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.061155 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8060b0d-7be6-4352-a8b2-74200dc61fdf-logs" (OuterVolumeSpecName: "logs") pod "e8060b0d-7be6-4352-a8b2-74200dc61fdf" (UID: "e8060b0d-7be6-4352-a8b2-74200dc61fdf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.061600 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8060b0d-7be6-4352-a8b2-74200dc61fdf-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.065553 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8060b0d-7be6-4352-a8b2-74200dc61fdf-kube-api-access-ttb9t" (OuterVolumeSpecName: "kube-api-access-ttb9t") pod "e8060b0d-7be6-4352-a8b2-74200dc61fdf" (UID: "e8060b0d-7be6-4352-a8b2-74200dc61fdf"). InnerVolumeSpecName "kube-api-access-ttb9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.103346 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8060b0d-7be6-4352-a8b2-74200dc61fdf" (UID: "e8060b0d-7be6-4352-a8b2-74200dc61fdf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.104344 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-config-data" (OuterVolumeSpecName: "config-data") pod "e8060b0d-7be6-4352-a8b2-74200dc61fdf" (UID: "e8060b0d-7be6-4352-a8b2-74200dc61fdf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.123457 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e8060b0d-7be6-4352-a8b2-74200dc61fdf" (UID: "e8060b0d-7be6-4352-a8b2-74200dc61fdf"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.163842 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.163881 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.163890 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttb9t\" (UniqueName: \"kubernetes.io/projected/e8060b0d-7be6-4352-a8b2-74200dc61fdf-kube-api-access-ttb9t\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.163901 4835 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8060b0d-7be6-4352-a8b2-74200dc61fdf-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.378802 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 18:34:56 crc kubenswrapper[4835]: W1003 18:34:56.379079 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56348f36_94bd_43e7_a6ea_d55206a5ccc3.slice/crio-3a795ff25af0e5c4fe6bed510613ec5121f5dbb5dffcb932a6dff6a46c875f69 WatchSource:0}: Error finding container 3a795ff25af0e5c4fe6bed510613ec5121f5dbb5dffcb932a6dff6a46c875f69: Status 404 returned error can't find the container with id 3a795ff25af0e5c4fe6bed510613ec5121f5dbb5dffcb932a6dff6a46c875f69 Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.418044 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"56348f36-94bd-43e7-a6ea-d55206a5ccc3","Type":"ContainerStarted","Data":"3a795ff25af0e5c4fe6bed510613ec5121f5dbb5dffcb932a6dff6a46c875f69"} Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.420233 4835 generic.go:334] "Generic (PLEG): container finished" podID="e8060b0d-7be6-4352-a8b2-74200dc61fdf" containerID="461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee" exitCode=0 Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.420287 4835 generic.go:334] "Generic (PLEG): container finished" podID="e8060b0d-7be6-4352-a8b2-74200dc61fdf" containerID="6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0" exitCode=143 Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.420314 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.420303 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e8060b0d-7be6-4352-a8b2-74200dc61fdf","Type":"ContainerDied","Data":"461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee"} Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.420604 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e8060b0d-7be6-4352-a8b2-74200dc61fdf","Type":"ContainerDied","Data":"6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0"} Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.420621 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e8060b0d-7be6-4352-a8b2-74200dc61fdf","Type":"ContainerDied","Data":"aa2e4bcbb1081f995f664bd71058e5a6da487ab22e81e96d2beedf5f382894c2"} Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.420640 4835 scope.go:117] "RemoveContainer" containerID="461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.420825 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.452917 4835 scope.go:117] "RemoveContainer" containerID="6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.453026 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.466894 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.469003 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.479579 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:34:56 crc kubenswrapper[4835]: E1003 18:34:56.479989 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8060b0d-7be6-4352-a8b2-74200dc61fdf" containerName="nova-metadata-log" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.480004 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8060b0d-7be6-4352-a8b2-74200dc61fdf" containerName="nova-metadata-log" Oct 03 18:34:56 crc kubenswrapper[4835]: E1003 18:34:56.480028 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8060b0d-7be6-4352-a8b2-74200dc61fdf" containerName="nova-metadata-metadata" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.480034 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8060b0d-7be6-4352-a8b2-74200dc61fdf" containerName="nova-metadata-metadata" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.480244 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8060b0d-7be6-4352-a8b2-74200dc61fdf" containerName="nova-metadata-log" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.480273 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8060b0d-7be6-4352-a8b2-74200dc61fdf" containerName="nova-metadata-metadata" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.481318 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.483896 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.484196 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.487367 4835 scope.go:117] "RemoveContainer" containerID="461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee" Oct 03 18:34:56 crc kubenswrapper[4835]: E1003 18:34:56.487839 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee\": container with ID starting with 461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee not found: ID does not exist" containerID="461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.487876 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee"} err="failed to get container status \"461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee\": rpc error: code = NotFound desc = could not find container \"461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee\": container with ID starting with 461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee not found: ID does not exist" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.487902 4835 scope.go:117] "RemoveContainer" containerID="6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0" Oct 03 18:34:56 crc kubenswrapper[4835]: E1003 18:34:56.488291 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0\": container with ID starting with 6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0 not found: ID does not exist" containerID="6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.488316 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0"} err="failed to get container status \"6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0\": rpc error: code = NotFound desc = could not find container \"6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0\": container with ID starting with 6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0 not found: ID does not exist" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.488332 4835 scope.go:117] "RemoveContainer" containerID="461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.488674 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee"} err="failed to get container status \"461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee\": rpc error: code = NotFound desc = could not find container \"461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee\": container with ID starting with 
461ebdfb43edb3833728b77396ff613b8b971885592c1f55792f9ef80cad64ee not found: ID does not exist" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.488691 4835 scope.go:117] "RemoveContainer" containerID="6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.489083 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0"} err="failed to get container status \"6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0\": rpc error: code = NotFound desc = could not find container \"6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0\": container with ID starting with 6435522cb530f66e22a87995e9e33d9c5800c87329a0c09956a80c1ebd5e7bb0 not found: ID does not exist" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.492241 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.575648 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/195a095e-91be-47e7-97de-a0dc42c301f7-logs\") pod \"nova-metadata-0\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.575993 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.576039 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-config-data\") pod \"nova-metadata-0\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.576131 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv4lf\" (UniqueName: \"kubernetes.io/projected/195a095e-91be-47e7-97de-a0dc42c301f7-kube-api-access-sv4lf\") pod \"nova-metadata-0\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.576206 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: E1003 18:34:56.587113 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="268eafef82b4556e3766dc232b2119cd77f3d227316f2b043a1750902060d4bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 18:34:56 crc kubenswrapper[4835]: E1003 18:34:56.590954 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" containerID="268eafef82b4556e3766dc232b2119cd77f3d227316f2b043a1750902060d4bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 18:34:56 crc kubenswrapper[4835]: E1003 18:34:56.592168 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="268eafef82b4556e3766dc232b2119cd77f3d227316f2b043a1750902060d4bf" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 18:34:56 crc kubenswrapper[4835]: E1003 18:34:56.592212 4835 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="46a4652d-e363-4818-9561-589c8be56373" containerName="nova-scheduler-scheduler" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.677358 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv4lf\" (UniqueName: \"kubernetes.io/projected/195a095e-91be-47e7-97de-a0dc42c301f7-kube-api-access-sv4lf\") pod \"nova-metadata-0\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.677444 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.677486 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/195a095e-91be-47e7-97de-a0dc42c301f7-logs\") pod \"nova-metadata-0\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.677531 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.677559 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-config-data\") pod \"nova-metadata-0\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.678091 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/195a095e-91be-47e7-97de-a0dc42c301f7-logs\") pod \"nova-metadata-0\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.683298 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.685138 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.692089 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-config-data\") pod \"nova-metadata-0\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.695455 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv4lf\" (UniqueName: \"kubernetes.io/projected/195a095e-91be-47e7-97de-a0dc42c301f7-kube-api-access-sv4lf\") pod \"nova-metadata-0\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.904028 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8060b0d-7be6-4352-a8b2-74200dc61fdf" path="/var/lib/kubelet/pods/e8060b0d-7be6-4352-a8b2-74200dc61fdf/volumes" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.951863 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 18:34:56 crc kubenswrapper[4835]: I1003 18:34:56.981517 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.086688 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-logs\") pod \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\" (UID: \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\") " Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.086859 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-config-data\") pod \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\" (UID: \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\") " Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.086910 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs6kb\" (UniqueName: \"kubernetes.io/projected/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-kube-api-access-rs6kb\") pod \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\" (UID: \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\") " Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.086927 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-combined-ca-bundle\") pod \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\" (UID: \"bf4e074a-b323-4dd7-98b4-5e18e3ab6246\") " Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.087156 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-logs" (OuterVolumeSpecName: "logs") pod "bf4e074a-b323-4dd7-98b4-5e18e3ab6246" (UID: "bf4e074a-b323-4dd7-98b4-5e18e3ab6246"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.087769 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.093428 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-kube-api-access-rs6kb" (OuterVolumeSpecName: "kube-api-access-rs6kb") pod "bf4e074a-b323-4dd7-98b4-5e18e3ab6246" (UID: "bf4e074a-b323-4dd7-98b4-5e18e3ab6246"). InnerVolumeSpecName "kube-api-access-rs6kb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.122323 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf4e074a-b323-4dd7-98b4-5e18e3ab6246" (UID: "bf4e074a-b323-4dd7-98b4-5e18e3ab6246"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.128122 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-config-data" (OuterVolumeSpecName: "config-data") pod "bf4e074a-b323-4dd7-98b4-5e18e3ab6246" (UID: "bf4e074a-b323-4dd7-98b4-5e18e3ab6246"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.188882 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.188915 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs6kb\" (UniqueName: \"kubernetes.io/projected/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-kube-api-access-rs6kb\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.188926 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf4e074a-b323-4dd7-98b4-5e18e3ab6246-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.283656 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.332285 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-d779b7fdc-pqmkj" podUID="e203cdb5-ef30-469a-bc4e-3bae2306043d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.184:5353: i/o timeout" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.410905 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.431314 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"56348f36-94bd-43e7-a6ea-d55206a5ccc3","Type":"ContainerStarted","Data":"56f3f8e03d0b0cf8a7f8222372f5e0ee9af735d77f268b15db85b5408c68f2d2"} Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.431475 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell1-conductor-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.434766 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"195a095e-91be-47e7-97de-a0dc42c301f7","Type":"ContainerStarted","Data":"60bac3981298c25d9a431fdb784676c64018342662e66ca7ef80f58306b1d9a0"} Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.438056 4835 generic.go:334] "Generic (PLEG): container finished" podID="bf4e074a-b323-4dd7-98b4-5e18e3ab6246" containerID="b9be68c011e4c3d26aa7c6575cc2ce9441fcb60c1e7541330ef68c599ba12c90" exitCode=0 Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.438922 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.446407 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf4e074a-b323-4dd7-98b4-5e18e3ab6246","Type":"ContainerDied","Data":"b9be68c011e4c3d26aa7c6575cc2ce9441fcb60c1e7541330ef68c599ba12c90"} Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.446479 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf4e074a-b323-4dd7-98b4-5e18e3ab6246","Type":"ContainerDied","Data":"ec0ae7ac8dfaf3c83e8fcd83acdfd2a943007d6248da7b978cc74d7f2492eb77"} Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.446632 4835 scope.go:117] "RemoveContainer" containerID="b9be68c011e4c3d26aa7c6575cc2ce9441fcb60c1e7541330ef68c599ba12c90" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.451890 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.451873712 podStartE2EDuration="2.451873712s" podCreationTimestamp="2025-10-03 18:34:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:34:57.447153367 +0000 UTC m=+1239.163094269" watchObservedRunningTime="2025-10-03 18:34:57.451873712 +0000 UTC m=+1239.167814584" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.477221 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.477286 4835 scope.go:117] "RemoveContainer" containerID="2880256aef8b5fe323c35578ece5a0d31b1aabba95a3adf0e534504e3e906ed1" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.487049 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.504681 4835 scope.go:117] "RemoveContainer" containerID="b9be68c011e4c3d26aa7c6575cc2ce9441fcb60c1e7541330ef68c599ba12c90" Oct 03 18:34:57 crc kubenswrapper[4835]: E1003 18:34:57.507390 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9be68c011e4c3d26aa7c6575cc2ce9441fcb60c1e7541330ef68c599ba12c90\": container with ID starting with b9be68c011e4c3d26aa7c6575cc2ce9441fcb60c1e7541330ef68c599ba12c90 not found: ID does not exist" containerID="b9be68c011e4c3d26aa7c6575cc2ce9441fcb60c1e7541330ef68c599ba12c90" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.507433 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9be68c011e4c3d26aa7c6575cc2ce9441fcb60c1e7541330ef68c599ba12c90"} err="failed to get container status \"b9be68c011e4c3d26aa7c6575cc2ce9441fcb60c1e7541330ef68c599ba12c90\": 
rpc error: code = NotFound desc = could not find container \"b9be68c011e4c3d26aa7c6575cc2ce9441fcb60c1e7541330ef68c599ba12c90\": container with ID starting with b9be68c011e4c3d26aa7c6575cc2ce9441fcb60c1e7541330ef68c599ba12c90 not found: ID does not exist" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.507460 4835 scope.go:117] "RemoveContainer" containerID="2880256aef8b5fe323c35578ece5a0d31b1aabba95a3adf0e534504e3e906ed1" Oct 03 18:34:57 crc kubenswrapper[4835]: E1003 18:34:57.507923 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2880256aef8b5fe323c35578ece5a0d31b1aabba95a3adf0e534504e3e906ed1\": container with ID starting with 2880256aef8b5fe323c35578ece5a0d31b1aabba95a3adf0e534504e3e906ed1 not found: ID does not exist" containerID="2880256aef8b5fe323c35578ece5a0d31b1aabba95a3adf0e534504e3e906ed1" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.507967 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2880256aef8b5fe323c35578ece5a0d31b1aabba95a3adf0e534504e3e906ed1"} err="failed to get container status \"2880256aef8b5fe323c35578ece5a0d31b1aabba95a3adf0e534504e3e906ed1\": rpc error: code = NotFound desc = could not find container \"2880256aef8b5fe323c35578ece5a0d31b1aabba95a3adf0e534504e3e906ed1\": container with ID starting with 2880256aef8b5fe323c35578ece5a0d31b1aabba95a3adf0e534504e3e906ed1 not found: ID does not exist" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.522272 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 18:34:57 crc kubenswrapper[4835]: E1003 18:34:57.522721 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf4e074a-b323-4dd7-98b4-5e18e3ab6246" containerName="nova-api-api" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.522740 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf4e074a-b323-4dd7-98b4-5e18e3ab6246" containerName="nova-api-api" Oct 03 18:34:57 crc kubenswrapper[4835]: E1003 18:34:57.522778 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf4e074a-b323-4dd7-98b4-5e18e3ab6246" containerName="nova-api-log" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.522784 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf4e074a-b323-4dd7-98b4-5e18e3ab6246" containerName="nova-api-log" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.523020 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf4e074a-b323-4dd7-98b4-5e18e3ab6246" containerName="nova-api-api" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.523039 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf4e074a-b323-4dd7-98b4-5e18e3ab6246" containerName="nova-api-log" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.524159 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.526679 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.535965 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.607532 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9dba733-1e73-4ed3-a029-53ff385f53a5-config-data\") pod \"nova-api-0\" (UID: \"c9dba733-1e73-4ed3-a029-53ff385f53a5\") " pod="openstack/nova-api-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.607658 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj57j\" (UniqueName: \"kubernetes.io/projected/c9dba733-1e73-4ed3-a029-53ff385f53a5-kube-api-access-jj57j\") pod \"nova-api-0\" (UID: \"c9dba733-1e73-4ed3-a029-53ff385f53a5\") " pod="openstack/nova-api-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.607906 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9dba733-1e73-4ed3-a029-53ff385f53a5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c9dba733-1e73-4ed3-a029-53ff385f53a5\") " pod="openstack/nova-api-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.608026 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9dba733-1e73-4ed3-a029-53ff385f53a5-logs\") pod \"nova-api-0\" (UID: \"c9dba733-1e73-4ed3-a029-53ff385f53a5\") " pod="openstack/nova-api-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.709507 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9dba733-1e73-4ed3-a029-53ff385f53a5-config-data\") pod \"nova-api-0\" (UID: \"c9dba733-1e73-4ed3-a029-53ff385f53a5\") " pod="openstack/nova-api-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.709838 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj57j\" (UniqueName: \"kubernetes.io/projected/c9dba733-1e73-4ed3-a029-53ff385f53a5-kube-api-access-jj57j\") pod \"nova-api-0\" (UID: \"c9dba733-1e73-4ed3-a029-53ff385f53a5\") " pod="openstack/nova-api-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.709915 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9dba733-1e73-4ed3-a029-53ff385f53a5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c9dba733-1e73-4ed3-a029-53ff385f53a5\") " pod="openstack/nova-api-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.709961 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9dba733-1e73-4ed3-a029-53ff385f53a5-logs\") pod \"nova-api-0\" (UID: \"c9dba733-1e73-4ed3-a029-53ff385f53a5\") " pod="openstack/nova-api-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.710432 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9dba733-1e73-4ed3-a029-53ff385f53a5-logs\") pod \"nova-api-0\" (UID: \"c9dba733-1e73-4ed3-a029-53ff385f53a5\") " 
pod="openstack/nova-api-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.714573 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9dba733-1e73-4ed3-a029-53ff385f53a5-config-data\") pod \"nova-api-0\" (UID: \"c9dba733-1e73-4ed3-a029-53ff385f53a5\") " pod="openstack/nova-api-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.715355 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9dba733-1e73-4ed3-a029-53ff385f53a5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c9dba733-1e73-4ed3-a029-53ff385f53a5\") " pod="openstack/nova-api-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.733387 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj57j\" (UniqueName: \"kubernetes.io/projected/c9dba733-1e73-4ed3-a029-53ff385f53a5-kube-api-access-jj57j\") pod \"nova-api-0\" (UID: \"c9dba733-1e73-4ed3-a029-53ff385f53a5\") " pod="openstack/nova-api-0" Oct 03 18:34:57 crc kubenswrapper[4835]: I1003 18:34:57.927396 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 18:34:58 crc kubenswrapper[4835]: I1003 18:34:58.392853 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:34:58 crc kubenswrapper[4835]: W1003 18:34:58.397457 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9dba733_1e73_4ed3_a029_53ff385f53a5.slice/crio-0601e3b1ed5946b1e57cc1dc1c2a86d65be50470f36818ea9eba2cce57f87957 WatchSource:0}: Error finding container 0601e3b1ed5946b1e57cc1dc1c2a86d65be50470f36818ea9eba2cce57f87957: Status 404 returned error can't find the container with id 0601e3b1ed5946b1e57cc1dc1c2a86d65be50470f36818ea9eba2cce57f87957 Oct 03 18:34:58 crc kubenswrapper[4835]: I1003 18:34:58.461622 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"195a095e-91be-47e7-97de-a0dc42c301f7","Type":"ContainerStarted","Data":"aff09693aa259e7dad1f626c7160906cb2b6e3ccdca0cb010c99eba4ca6e5a44"} Oct 03 18:34:58 crc kubenswrapper[4835]: I1003 18:34:58.461665 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"195a095e-91be-47e7-97de-a0dc42c301f7","Type":"ContainerStarted","Data":"e3ca1784094827ceb926c99764374ee5fdcba194fb4aea72a1439281ac9157d4"} Oct 03 18:34:58 crc kubenswrapper[4835]: I1003 18:34:58.464941 4835 generic.go:334] "Generic (PLEG): container finished" podID="46a4652d-e363-4818-9561-589c8be56373" containerID="268eafef82b4556e3766dc232b2119cd77f3d227316f2b043a1750902060d4bf" exitCode=0 Oct 03 18:34:58 crc kubenswrapper[4835]: I1003 18:34:58.464998 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46a4652d-e363-4818-9561-589c8be56373","Type":"ContainerDied","Data":"268eafef82b4556e3766dc232b2119cd77f3d227316f2b043a1750902060d4bf"} Oct 03 18:34:58 crc kubenswrapper[4835]: I1003 18:34:58.467198 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9dba733-1e73-4ed3-a029-53ff385f53a5","Type":"ContainerStarted","Data":"0601e3b1ed5946b1e57cc1dc1c2a86d65be50470f36818ea9eba2cce57f87957"} Oct 03 18:34:58 crc kubenswrapper[4835]: I1003 18:34:58.506626 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" 
podStartSLOduration=2.50659659 podStartE2EDuration="2.50659659s" podCreationTimestamp="2025-10-03 18:34:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:34:58.498443581 +0000 UTC m=+1240.214384453" watchObservedRunningTime="2025-10-03 18:34:58.50659659 +0000 UTC m=+1240.222537462" Oct 03 18:34:58 crc kubenswrapper[4835]: I1003 18:34:58.906260 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf4e074a-b323-4dd7-98b4-5e18e3ab6246" path="/var/lib/kubelet/pods/bf4e074a-b323-4dd7-98b4-5e18e3ab6246/volumes" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.082184 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.164755 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a4652d-e363-4818-9561-589c8be56373-config-data\") pod \"46a4652d-e363-4818-9561-589c8be56373\" (UID: \"46a4652d-e363-4818-9561-589c8be56373\") " Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.164813 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6fx2\" (UniqueName: \"kubernetes.io/projected/46a4652d-e363-4818-9561-589c8be56373-kube-api-access-k6fx2\") pod \"46a4652d-e363-4818-9561-589c8be56373\" (UID: \"46a4652d-e363-4818-9561-589c8be56373\") " Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.165026 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a4652d-e363-4818-9561-589c8be56373-combined-ca-bundle\") pod \"46a4652d-e363-4818-9561-589c8be56373\" (UID: \"46a4652d-e363-4818-9561-589c8be56373\") " Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.171272 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46a4652d-e363-4818-9561-589c8be56373-kube-api-access-k6fx2" (OuterVolumeSpecName: "kube-api-access-k6fx2") pod "46a4652d-e363-4818-9561-589c8be56373" (UID: "46a4652d-e363-4818-9561-589c8be56373"). InnerVolumeSpecName "kube-api-access-k6fx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.195363 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46a4652d-e363-4818-9561-589c8be56373-config-data" (OuterVolumeSpecName: "config-data") pod "46a4652d-e363-4818-9561-589c8be56373" (UID: "46a4652d-e363-4818-9561-589c8be56373"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.207090 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46a4652d-e363-4818-9561-589c8be56373-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46a4652d-e363-4818-9561-589c8be56373" (UID: "46a4652d-e363-4818-9561-589c8be56373"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.267819 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a4652d-e363-4818-9561-589c8be56373-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.267850 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a4652d-e363-4818-9561-589c8be56373-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.267875 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6fx2\" (UniqueName: \"kubernetes.io/projected/46a4652d-e363-4818-9561-589c8be56373-kube-api-access-k6fx2\") on node \"crc\" DevicePath \"\"" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.482658 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.482678 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46a4652d-e363-4818-9561-589c8be56373","Type":"ContainerDied","Data":"7f034a214f2832d370b02b3d1c51e8f5586d83c9a05f4dcff110d2f632178582"} Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.482778 4835 scope.go:117] "RemoveContainer" containerID="268eafef82b4556e3766dc232b2119cd77f3d227316f2b043a1750902060d4bf" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.486321 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9dba733-1e73-4ed3-a029-53ff385f53a5","Type":"ContainerStarted","Data":"3e91860e255938939441bf01f1505e8ed1aec50033d1a72f2c184ada6d06b3ef"} Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.486359 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9dba733-1e73-4ed3-a029-53ff385f53a5","Type":"ContainerStarted","Data":"b40bdcf4296a048744855fbf2661efc8ff2b4a7771b713fc5528404c5a2d2bf3"} Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.523624 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.52360557 podStartE2EDuration="2.52360557s" podCreationTimestamp="2025-10-03 18:34:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:34:59.50585994 +0000 UTC m=+1241.221800812" watchObservedRunningTime="2025-10-03 18:34:59.52360557 +0000 UTC m=+1241.239546442" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.537986 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.563201 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.576184 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 18:34:59 crc kubenswrapper[4835]: E1003 18:34:59.576680 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46a4652d-e363-4818-9561-589c8be56373" containerName="nova-scheduler-scheduler" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.576699 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="46a4652d-e363-4818-9561-589c8be56373" containerName="nova-scheduler-scheduler" Oct 03 18:34:59 crc 
kubenswrapper[4835]: I1003 18:34:59.576893 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="46a4652d-e363-4818-9561-589c8be56373" containerName="nova-scheduler-scheduler" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.577741 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.580662 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.595115 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.679255 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/949e50ea-2b76-4506-8fbb-fb58efd9f020-config-data\") pod \"nova-scheduler-0\" (UID: \"949e50ea-2b76-4506-8fbb-fb58efd9f020\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.679395 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg4zc\" (UniqueName: \"kubernetes.io/projected/949e50ea-2b76-4506-8fbb-fb58efd9f020-kube-api-access-fg4zc\") pod \"nova-scheduler-0\" (UID: \"949e50ea-2b76-4506-8fbb-fb58efd9f020\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.679518 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949e50ea-2b76-4506-8fbb-fb58efd9f020-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"949e50ea-2b76-4506-8fbb-fb58efd9f020\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.781132 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg4zc\" (UniqueName: \"kubernetes.io/projected/949e50ea-2b76-4506-8fbb-fb58efd9f020-kube-api-access-fg4zc\") pod \"nova-scheduler-0\" (UID: \"949e50ea-2b76-4506-8fbb-fb58efd9f020\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.781465 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949e50ea-2b76-4506-8fbb-fb58efd9f020-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"949e50ea-2b76-4506-8fbb-fb58efd9f020\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.781518 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/949e50ea-2b76-4506-8fbb-fb58efd9f020-config-data\") pod \"nova-scheduler-0\" (UID: \"949e50ea-2b76-4506-8fbb-fb58efd9f020\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.785128 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/949e50ea-2b76-4506-8fbb-fb58efd9f020-config-data\") pod \"nova-scheduler-0\" (UID: \"949e50ea-2b76-4506-8fbb-fb58efd9f020\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.785342 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/949e50ea-2b76-4506-8fbb-fb58efd9f020-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"949e50ea-2b76-4506-8fbb-fb58efd9f020\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.796100 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg4zc\" (UniqueName: \"kubernetes.io/projected/949e50ea-2b76-4506-8fbb-fb58efd9f020-kube-api-access-fg4zc\") pod \"nova-scheduler-0\" (UID: \"949e50ea-2b76-4506-8fbb-fb58efd9f020\") " pod="openstack/nova-scheduler-0" Oct 03 18:34:59 crc kubenswrapper[4835]: I1003 18:34:59.896377 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 18:35:00 crc kubenswrapper[4835]: I1003 18:35:00.415799 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 18:35:00 crc kubenswrapper[4835]: W1003 18:35:00.425426 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod949e50ea_2b76_4506_8fbb_fb58efd9f020.slice/crio-a5c8489082aae1921379274e630bfb7f04de8bc1cc5a89a61b6a41c0b6199b30 WatchSource:0}: Error finding container a5c8489082aae1921379274e630bfb7f04de8bc1cc5a89a61b6a41c0b6199b30: Status 404 returned error can't find the container with id a5c8489082aae1921379274e630bfb7f04de8bc1cc5a89a61b6a41c0b6199b30 Oct 03 18:35:00 crc kubenswrapper[4835]: I1003 18:35:00.497291 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"949e50ea-2b76-4506-8fbb-fb58efd9f020","Type":"ContainerStarted","Data":"a5c8489082aae1921379274e630bfb7f04de8bc1cc5a89a61b6a41c0b6199b30"} Oct 03 18:35:00 crc kubenswrapper[4835]: I1003 18:35:00.889562 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46a4652d-e363-4818-9561-589c8be56373" path="/var/lib/kubelet/pods/46a4652d-e363-4818-9561-589c8be56373/volumes" Oct 03 18:35:01 crc kubenswrapper[4835]: I1003 18:35:01.516365 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"949e50ea-2b76-4506-8fbb-fb58efd9f020","Type":"ContainerStarted","Data":"f077c3054fece741a7b9f5660d2d8290d0de7ad4704eae303ac506c3628747fc"} Oct 03 18:35:01 crc kubenswrapper[4835]: I1003 18:35:01.534491 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 18:35:01 crc kubenswrapper[4835]: I1003 18:35:01.534709 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a53486c1-995b-46d5-84a0-f74f2ec2b5ba" containerName="kube-state-metrics" containerID="cri-o://13d37a7aa6826d5448b04ad3aaf523ded57f96bb2c465815ee8aa2ea884063e2" gracePeriod=30 Oct 03 18:35:01 crc kubenswrapper[4835]: I1003 18:35:01.536079 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.536056596 podStartE2EDuration="2.536056596s" podCreationTimestamp="2025-10-03 18:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:35:01.531735771 +0000 UTC m=+1243.247676643" watchObservedRunningTime="2025-10-03 18:35:01.536056596 +0000 UTC m=+1243.251997468" Oct 03 18:35:01 crc kubenswrapper[4835]: I1003 18:35:01.954284 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 18:35:01 crc 
kubenswrapper[4835]: I1003 18:35:01.954337 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.027575 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.139616 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lcrg\" (UniqueName: \"kubernetes.io/projected/a53486c1-995b-46d5-84a0-f74f2ec2b5ba-kube-api-access-7lcrg\") pod \"a53486c1-995b-46d5-84a0-f74f2ec2b5ba\" (UID: \"a53486c1-995b-46d5-84a0-f74f2ec2b5ba\") " Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.148683 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a53486c1-995b-46d5-84a0-f74f2ec2b5ba-kube-api-access-7lcrg" (OuterVolumeSpecName: "kube-api-access-7lcrg") pod "a53486c1-995b-46d5-84a0-f74f2ec2b5ba" (UID: "a53486c1-995b-46d5-84a0-f74f2ec2b5ba"). InnerVolumeSpecName "kube-api-access-7lcrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.241816 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lcrg\" (UniqueName: \"kubernetes.io/projected/a53486c1-995b-46d5-84a0-f74f2ec2b5ba-kube-api-access-7lcrg\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.530711 4835 generic.go:334] "Generic (PLEG): container finished" podID="a53486c1-995b-46d5-84a0-f74f2ec2b5ba" containerID="13d37a7aa6826d5448b04ad3aaf523ded57f96bb2c465815ee8aa2ea884063e2" exitCode=2 Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.530853 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.530840 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a53486c1-995b-46d5-84a0-f74f2ec2b5ba","Type":"ContainerDied","Data":"13d37a7aa6826d5448b04ad3aaf523ded57f96bb2c465815ee8aa2ea884063e2"} Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.531045 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a53486c1-995b-46d5-84a0-f74f2ec2b5ba","Type":"ContainerDied","Data":"64418d5ba5174c55f5beeadfef4e07dba66dd856fd8c251e15fb9329560b6575"} Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.531100 4835 scope.go:117] "RemoveContainer" containerID="13d37a7aa6826d5448b04ad3aaf523ded57f96bb2c465815ee8aa2ea884063e2" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.557691 4835 scope.go:117] "RemoveContainer" containerID="13d37a7aa6826d5448b04ad3aaf523ded57f96bb2c465815ee8aa2ea884063e2" Oct 03 18:35:02 crc kubenswrapper[4835]: E1003 18:35:02.558174 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d37a7aa6826d5448b04ad3aaf523ded57f96bb2c465815ee8aa2ea884063e2\": container with ID starting with 13d37a7aa6826d5448b04ad3aaf523ded57f96bb2c465815ee8aa2ea884063e2 not found: ID does not exist" containerID="13d37a7aa6826d5448b04ad3aaf523ded57f96bb2c465815ee8aa2ea884063e2" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.558202 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d37a7aa6826d5448b04ad3aaf523ded57f96bb2c465815ee8aa2ea884063e2"} err="failed to get container status \"13d37a7aa6826d5448b04ad3aaf523ded57f96bb2c465815ee8aa2ea884063e2\": rpc error: code = NotFound desc = could not find container \"13d37a7aa6826d5448b04ad3aaf523ded57f96bb2c465815ee8aa2ea884063e2\": container with ID starting with 13d37a7aa6826d5448b04ad3aaf523ded57f96bb2c465815ee8aa2ea884063e2 not found: ID does not exist" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.576141 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.595952 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.607411 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 18:35:02 crc kubenswrapper[4835]: E1003 18:35:02.621395 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a53486c1-995b-46d5-84a0-f74f2ec2b5ba" containerName="kube-state-metrics" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.621434 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a53486c1-995b-46d5-84a0-f74f2ec2b5ba" containerName="kube-state-metrics" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.621845 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a53486c1-995b-46d5-84a0-f74f2ec2b5ba" containerName="kube-state-metrics" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.622721 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.622821 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.625954 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.630107 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.750028 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a09c4697-168d-4762-8169-e36de57bfd7c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a09c4697-168d-4762-8169-e36de57bfd7c\") " pod="openstack/kube-state-metrics-0" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.750081 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09c4697-168d-4762-8169-e36de57bfd7c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a09c4697-168d-4762-8169-e36de57bfd7c\") " pod="openstack/kube-state-metrics-0" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.750151 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-529dk\" (UniqueName: \"kubernetes.io/projected/a09c4697-168d-4762-8169-e36de57bfd7c-kube-api-access-529dk\") pod \"kube-state-metrics-0\" (UID: \"a09c4697-168d-4762-8169-e36de57bfd7c\") " pod="openstack/kube-state-metrics-0" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.750182 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a09c4697-168d-4762-8169-e36de57bfd7c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a09c4697-168d-4762-8169-e36de57bfd7c\") " pod="openstack/kube-state-metrics-0" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.851870 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a09c4697-168d-4762-8169-e36de57bfd7c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a09c4697-168d-4762-8169-e36de57bfd7c\") " pod="openstack/kube-state-metrics-0" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.851915 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09c4697-168d-4762-8169-e36de57bfd7c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a09c4697-168d-4762-8169-e36de57bfd7c\") " pod="openstack/kube-state-metrics-0" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.851962 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-529dk\" (UniqueName: \"kubernetes.io/projected/a09c4697-168d-4762-8169-e36de57bfd7c-kube-api-access-529dk\") pod \"kube-state-metrics-0\" (UID: \"a09c4697-168d-4762-8169-e36de57bfd7c\") " pod="openstack/kube-state-metrics-0" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.851996 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a09c4697-168d-4762-8169-e36de57bfd7c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: 
\"a09c4697-168d-4762-8169-e36de57bfd7c\") " pod="openstack/kube-state-metrics-0" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.856637 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a09c4697-168d-4762-8169-e36de57bfd7c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a09c4697-168d-4762-8169-e36de57bfd7c\") " pod="openstack/kube-state-metrics-0" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.856701 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a09c4697-168d-4762-8169-e36de57bfd7c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a09c4697-168d-4762-8169-e36de57bfd7c\") " pod="openstack/kube-state-metrics-0" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.871971 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09c4697-168d-4762-8169-e36de57bfd7c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a09c4697-168d-4762-8169-e36de57bfd7c\") " pod="openstack/kube-state-metrics-0" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.878775 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-529dk\" (UniqueName: \"kubernetes.io/projected/a09c4697-168d-4762-8169-e36de57bfd7c-kube-api-access-529dk\") pod \"kube-state-metrics-0\" (UID: \"a09c4697-168d-4762-8169-e36de57bfd7c\") " pod="openstack/kube-state-metrics-0" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.901751 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a53486c1-995b-46d5-84a0-f74f2ec2b5ba" path="/var/lib/kubelet/pods/a53486c1-995b-46d5-84a0-f74f2ec2b5ba/volumes" Oct 03 18:35:02 crc kubenswrapper[4835]: I1003 18:35:02.943603 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 18:35:03 crc kubenswrapper[4835]: I1003 18:35:03.432506 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 18:35:03 crc kubenswrapper[4835]: I1003 18:35:03.543602 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a09c4697-168d-4762-8169-e36de57bfd7c","Type":"ContainerStarted","Data":"f09dde8ff6b0f93cbcc7325123e46b49fb2a1201bf3e715d8e5bb590aa679db4"} Oct 03 18:35:03 crc kubenswrapper[4835]: I1003 18:35:03.765363 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:35:03 crc kubenswrapper[4835]: I1003 18:35:03.765691 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerName="ceilometer-central-agent" containerID="cri-o://a340fd7619fbe376031c0d7d73d4c9b53a33d1e323d355f67e7a454a8c325b69" gracePeriod=30 Oct 03 18:35:03 crc kubenswrapper[4835]: I1003 18:35:03.765757 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerName="proxy-httpd" containerID="cri-o://c781100bfeed11c33023ae039f6c8460894c23edcdd15494ffdf06817ab87dcb" gracePeriod=30 Oct 03 18:35:03 crc kubenswrapper[4835]: I1003 18:35:03.765793 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerName="ceilometer-notification-agent" containerID="cri-o://297027a217607a798f2318c367cfdbec57bc52f5554b2e9687796c1ae994e877" gracePeriod=30 Oct 03 18:35:03 crc kubenswrapper[4835]: I1003 18:35:03.765793 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerName="sg-core" containerID="cri-o://356926a0203256759bebd6f40cfe285c0084a92e52991134700c2c40a7507b90" gracePeriod=30 Oct 03 18:35:04 crc kubenswrapper[4835]: I1003 18:35:04.557029 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a09c4697-168d-4762-8169-e36de57bfd7c","Type":"ContainerStarted","Data":"5986168645dbc6d777d4196a260f97f0152eb2d4063aacf5a33ed8a44f441d3c"} Oct 03 18:35:04 crc kubenswrapper[4835]: I1003 18:35:04.557410 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 18:35:04 crc kubenswrapper[4835]: I1003 18:35:04.560152 4835 generic.go:334] "Generic (PLEG): container finished" podID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerID="c781100bfeed11c33023ae039f6c8460894c23edcdd15494ffdf06817ab87dcb" exitCode=0 Oct 03 18:35:04 crc kubenswrapper[4835]: I1003 18:35:04.560176 4835 generic.go:334] "Generic (PLEG): container finished" podID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerID="356926a0203256759bebd6f40cfe285c0084a92e52991134700c2c40a7507b90" exitCode=2 Oct 03 18:35:04 crc kubenswrapper[4835]: I1003 18:35:04.560186 4835 generic.go:334] "Generic (PLEG): container finished" podID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerID="a340fd7619fbe376031c0d7d73d4c9b53a33d1e323d355f67e7a454a8c325b69" exitCode=0 Oct 03 18:35:04 crc kubenswrapper[4835]: I1003 18:35:04.560340 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3e75e68d-44d9-475b-8a16-d8b9bf770678","Type":"ContainerDied","Data":"c781100bfeed11c33023ae039f6c8460894c23edcdd15494ffdf06817ab87dcb"} Oct 03 18:35:04 crc kubenswrapper[4835]: I1003 18:35:04.560422 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e75e68d-44d9-475b-8a16-d8b9bf770678","Type":"ContainerDied","Data":"356926a0203256759bebd6f40cfe285c0084a92e52991134700c2c40a7507b90"} Oct 03 18:35:04 crc kubenswrapper[4835]: I1003 18:35:04.560488 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e75e68d-44d9-475b-8a16-d8b9bf770678","Type":"ContainerDied","Data":"a340fd7619fbe376031c0d7d73d4c9b53a33d1e323d355f67e7a454a8c325b69"} Oct 03 18:35:04 crc kubenswrapper[4835]: I1003 18:35:04.573436 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.2122399440000002 podStartE2EDuration="2.573418745s" podCreationTimestamp="2025-10-03 18:35:02 +0000 UTC" firstStartedPulling="2025-10-03 18:35:03.428660686 +0000 UTC m=+1245.144601558" lastFinishedPulling="2025-10-03 18:35:03.789839487 +0000 UTC m=+1245.505780359" observedRunningTime="2025-10-03 18:35:04.571504923 +0000 UTC m=+1246.287445805" watchObservedRunningTime="2025-10-03 18:35:04.573418745 +0000 UTC m=+1246.289359617" Oct 03 18:35:04 crc kubenswrapper[4835]: I1003 18:35:04.897405 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 18:35:05 crc kubenswrapper[4835]: I1003 18:35:05.919061 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 03 18:35:06 crc kubenswrapper[4835]: I1003 18:35:06.952430 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 18:35:06 crc kubenswrapper[4835]: I1003 18:35:06.952476 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 18:35:07 crc kubenswrapper[4835]: I1003 18:35:07.927777 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 18:35:07 crc kubenswrapper[4835]: I1003 18:35:07.928168 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 18:35:08 crc kubenswrapper[4835]: I1003 18:35:08.003243 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="195a095e-91be-47e7-97de-a0dc42c301f7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 18:35:08 crc kubenswrapper[4835]: I1003 18:35:08.004327 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="195a095e-91be-47e7-97de-a0dc42c301f7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 18:35:08 crc kubenswrapper[4835]: I1003 18:35:08.968272 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c9dba733-1e73-4ed3-a029-53ff385f53a5" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 18:35:09 crc kubenswrapper[4835]: I1003 
18:35:09.010294 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c9dba733-1e73-4ed3-a029-53ff385f53a5" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 18:35:09 crc kubenswrapper[4835]: I1003 18:35:09.897410 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 18:35:09 crc kubenswrapper[4835]: I1003 18:35:09.943506 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.161087 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.313620 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e75e68d-44d9-475b-8a16-d8b9bf770678-run-httpd\") pod \"3e75e68d-44d9-475b-8a16-d8b9bf770678\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.313732 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-config-data\") pod \"3e75e68d-44d9-475b-8a16-d8b9bf770678\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.313763 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-sg-core-conf-yaml\") pod \"3e75e68d-44d9-475b-8a16-d8b9bf770678\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.313820 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-combined-ca-bundle\") pod \"3e75e68d-44d9-475b-8a16-d8b9bf770678\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.313867 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e75e68d-44d9-475b-8a16-d8b9bf770678-log-httpd\") pod \"3e75e68d-44d9-475b-8a16-d8b9bf770678\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.313956 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-scripts\") pod \"3e75e68d-44d9-475b-8a16-d8b9bf770678\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.314094 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp5tp\" (UniqueName: \"kubernetes.io/projected/3e75e68d-44d9-475b-8a16-d8b9bf770678-kube-api-access-sp5tp\") pod \"3e75e68d-44d9-475b-8a16-d8b9bf770678\" (UID: \"3e75e68d-44d9-475b-8a16-d8b9bf770678\") " Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.315454 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e75e68d-44d9-475b-8a16-d8b9bf770678-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"3e75e68d-44d9-475b-8a16-d8b9bf770678" (UID: "3e75e68d-44d9-475b-8a16-d8b9bf770678"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.315503 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e75e68d-44d9-475b-8a16-d8b9bf770678-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3e75e68d-44d9-475b-8a16-d8b9bf770678" (UID: "3e75e68d-44d9-475b-8a16-d8b9bf770678"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.320397 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-scripts" (OuterVolumeSpecName: "scripts") pod "3e75e68d-44d9-475b-8a16-d8b9bf770678" (UID: "3e75e68d-44d9-475b-8a16-d8b9bf770678"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.332101 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e75e68d-44d9-475b-8a16-d8b9bf770678-kube-api-access-sp5tp" (OuterVolumeSpecName: "kube-api-access-sp5tp") pod "3e75e68d-44d9-475b-8a16-d8b9bf770678" (UID: "3e75e68d-44d9-475b-8a16-d8b9bf770678"). InnerVolumeSpecName "kube-api-access-sp5tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.352796 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3e75e68d-44d9-475b-8a16-d8b9bf770678" (UID: "3e75e68d-44d9-475b-8a16-d8b9bf770678"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.406594 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e75e68d-44d9-475b-8a16-d8b9bf770678" (UID: "3e75e68d-44d9-475b-8a16-d8b9bf770678"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.417899 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e75e68d-44d9-475b-8a16-d8b9bf770678-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.417933 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.417943 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.417951 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e75e68d-44d9-475b-8a16-d8b9bf770678-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.417961 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.417971 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp5tp\" (UniqueName: \"kubernetes.io/projected/3e75e68d-44d9-475b-8a16-d8b9bf770678-kube-api-access-sp5tp\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.432227 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-config-data" (OuterVolumeSpecName: "config-data") pod "3e75e68d-44d9-475b-8a16-d8b9bf770678" (UID: "3e75e68d-44d9-475b-8a16-d8b9bf770678"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.519946 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e75e68d-44d9-475b-8a16-d8b9bf770678-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.647426 4835 generic.go:334] "Generic (PLEG): container finished" podID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerID="297027a217607a798f2318c367cfdbec57bc52f5554b2e9687796c1ae994e877" exitCode=0 Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.647491 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.647545 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e75e68d-44d9-475b-8a16-d8b9bf770678","Type":"ContainerDied","Data":"297027a217607a798f2318c367cfdbec57bc52f5554b2e9687796c1ae994e877"} Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.647574 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e75e68d-44d9-475b-8a16-d8b9bf770678","Type":"ContainerDied","Data":"a756470c0a90b277bc60e8f3c2818a9f8191567d64244261159c93ee48f949bc"} Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.647593 4835 scope.go:117] "RemoveContainer" containerID="c781100bfeed11c33023ae039f6c8460894c23edcdd15494ffdf06817ab87dcb" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.669865 4835 scope.go:117] "RemoveContainer" containerID="356926a0203256759bebd6f40cfe285c0084a92e52991134700c2c40a7507b90" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.683652 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.687703 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.696452 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.696556 4835 scope.go:117] "RemoveContainer" containerID="297027a217607a798f2318c367cfdbec57bc52f5554b2e9687796c1ae994e877" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.712672 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:35:10 crc kubenswrapper[4835]: E1003 18:35:10.713394 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerName="sg-core" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.713490 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerName="sg-core" Oct 03 18:35:10 crc kubenswrapper[4835]: E1003 18:35:10.713563 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerName="proxy-httpd" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.713624 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerName="proxy-httpd" Oct 03 18:35:10 crc kubenswrapper[4835]: E1003 18:35:10.713683 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerName="ceilometer-notification-agent" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.713736 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerName="ceilometer-notification-agent" Oct 03 18:35:10 crc kubenswrapper[4835]: E1003 18:35:10.713802 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerName="ceilometer-central-agent" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.713855 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerName="ceilometer-central-agent" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.714166 4835 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerName="ceilometer-central-agent" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.714349 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerName="proxy-httpd" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.714440 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerName="sg-core" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.714516 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" containerName="ceilometer-notification-agent" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.716519 4835 scope.go:117] "RemoveContainer" containerID="a340fd7619fbe376031c0d7d73d4c9b53a33d1e323d355f67e7a454a8c325b69" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.724749 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.727463 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.727506 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.727464 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.732122 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.738303 4835 scope.go:117] "RemoveContainer" containerID="c781100bfeed11c33023ae039f6c8460894c23edcdd15494ffdf06817ab87dcb" Oct 03 18:35:10 crc kubenswrapper[4835]: E1003 18:35:10.739103 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c781100bfeed11c33023ae039f6c8460894c23edcdd15494ffdf06817ab87dcb\": container with ID starting with c781100bfeed11c33023ae039f6c8460894c23edcdd15494ffdf06817ab87dcb not found: ID does not exist" containerID="c781100bfeed11c33023ae039f6c8460894c23edcdd15494ffdf06817ab87dcb" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.739151 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c781100bfeed11c33023ae039f6c8460894c23edcdd15494ffdf06817ab87dcb"} err="failed to get container status \"c781100bfeed11c33023ae039f6c8460894c23edcdd15494ffdf06817ab87dcb\": rpc error: code = NotFound desc = could not find container \"c781100bfeed11c33023ae039f6c8460894c23edcdd15494ffdf06817ab87dcb\": container with ID starting with c781100bfeed11c33023ae039f6c8460894c23edcdd15494ffdf06817ab87dcb not found: ID does not exist" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.739184 4835 scope.go:117] "RemoveContainer" containerID="356926a0203256759bebd6f40cfe285c0084a92e52991134700c2c40a7507b90" Oct 03 18:35:10 crc kubenswrapper[4835]: E1003 18:35:10.740968 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"356926a0203256759bebd6f40cfe285c0084a92e52991134700c2c40a7507b90\": container with ID starting with 356926a0203256759bebd6f40cfe285c0084a92e52991134700c2c40a7507b90 not found: ID does not exist" 
containerID="356926a0203256759bebd6f40cfe285c0084a92e52991134700c2c40a7507b90" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.741012 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"356926a0203256759bebd6f40cfe285c0084a92e52991134700c2c40a7507b90"} err="failed to get container status \"356926a0203256759bebd6f40cfe285c0084a92e52991134700c2c40a7507b90\": rpc error: code = NotFound desc = could not find container \"356926a0203256759bebd6f40cfe285c0084a92e52991134700c2c40a7507b90\": container with ID starting with 356926a0203256759bebd6f40cfe285c0084a92e52991134700c2c40a7507b90 not found: ID does not exist" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.741038 4835 scope.go:117] "RemoveContainer" containerID="297027a217607a798f2318c367cfdbec57bc52f5554b2e9687796c1ae994e877" Oct 03 18:35:10 crc kubenswrapper[4835]: E1003 18:35:10.741450 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"297027a217607a798f2318c367cfdbec57bc52f5554b2e9687796c1ae994e877\": container with ID starting with 297027a217607a798f2318c367cfdbec57bc52f5554b2e9687796c1ae994e877 not found: ID does not exist" containerID="297027a217607a798f2318c367cfdbec57bc52f5554b2e9687796c1ae994e877" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.741482 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"297027a217607a798f2318c367cfdbec57bc52f5554b2e9687796c1ae994e877"} err="failed to get container status \"297027a217607a798f2318c367cfdbec57bc52f5554b2e9687796c1ae994e877\": rpc error: code = NotFound desc = could not find container \"297027a217607a798f2318c367cfdbec57bc52f5554b2e9687796c1ae994e877\": container with ID starting with 297027a217607a798f2318c367cfdbec57bc52f5554b2e9687796c1ae994e877 not found: ID does not exist" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.741502 4835 scope.go:117] "RemoveContainer" containerID="a340fd7619fbe376031c0d7d73d4c9b53a33d1e323d355f67e7a454a8c325b69" Oct 03 18:35:10 crc kubenswrapper[4835]: E1003 18:35:10.742890 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a340fd7619fbe376031c0d7d73d4c9b53a33d1e323d355f67e7a454a8c325b69\": container with ID starting with a340fd7619fbe376031c0d7d73d4c9b53a33d1e323d355f67e7a454a8c325b69 not found: ID does not exist" containerID="a340fd7619fbe376031c0d7d73d4c9b53a33d1e323d355f67e7a454a8c325b69" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.742911 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a340fd7619fbe376031c0d7d73d4c9b53a33d1e323d355f67e7a454a8c325b69"} err="failed to get container status \"a340fd7619fbe376031c0d7d73d4c9b53a33d1e323d355f67e7a454a8c325b69\": rpc error: code = NotFound desc = could not find container \"a340fd7619fbe376031c0d7d73d4c9b53a33d1e323d355f67e7a454a8c325b69\": container with ID starting with a340fd7619fbe376031c0d7d73d4c9b53a33d1e323d355f67e7a454a8c325b69 not found: ID does not exist" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.826741 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c608fb-d66b-44a0-8a50-930c83040b5c-log-httpd\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: 
I1003 18:35:10.827128 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccvkf\" (UniqueName: \"kubernetes.io/projected/22c608fb-d66b-44a0-8a50-930c83040b5c-kube-api-access-ccvkf\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.827156 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.827175 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-scripts\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.827213 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.827251 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-config-data\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.827310 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c608fb-d66b-44a0-8a50-930c83040b5c-run-httpd\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.827326 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.888770 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e75e68d-44d9-475b-8a16-d8b9bf770678" path="/var/lib/kubelet/pods/3e75e68d-44d9-475b-8a16-d8b9bf770678/volumes" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.928589 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c608fb-d66b-44a0-8a50-930c83040b5c-run-httpd\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.928640 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 
18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.928701 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c608fb-d66b-44a0-8a50-930c83040b5c-log-httpd\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.928763 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccvkf\" (UniqueName: \"kubernetes.io/projected/22c608fb-d66b-44a0-8a50-930c83040b5c-kube-api-access-ccvkf\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.928786 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.928810 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-scripts\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.928849 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.928888 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-config-data\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.930307 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c608fb-d66b-44a0-8a50-930c83040b5c-log-httpd\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.930540 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c608fb-d66b-44a0-8a50-930c83040b5c-run-httpd\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.933634 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.934372 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: 
I1003 18:35:10.935143 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-scripts\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.944800 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-config-data\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.945531 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:10 crc kubenswrapper[4835]: I1003 18:35:10.953311 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccvkf\" (UniqueName: \"kubernetes.io/projected/22c608fb-d66b-44a0-8a50-930c83040b5c-kube-api-access-ccvkf\") pod \"ceilometer-0\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " pod="openstack/ceilometer-0" Oct 03 18:35:11 crc kubenswrapper[4835]: I1003 18:35:11.048306 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:35:12 crc kubenswrapper[4835]: I1003 18:35:11.509567 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:35:12 crc kubenswrapper[4835]: W1003 18:35:11.524765 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22c608fb_d66b_44a0_8a50_930c83040b5c.slice/crio-24b2c7122c24eb215c9a529d6c698a3473363437cc95e3c195bfdb056d7b509e WatchSource:0}: Error finding container 24b2c7122c24eb215c9a529d6c698a3473363437cc95e3c195bfdb056d7b509e: Status 404 returned error can't find the container with id 24b2c7122c24eb215c9a529d6c698a3473363437cc95e3c195bfdb056d7b509e Oct 03 18:35:12 crc kubenswrapper[4835]: I1003 18:35:11.658932 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c608fb-d66b-44a0-8a50-930c83040b5c","Type":"ContainerStarted","Data":"24b2c7122c24eb215c9a529d6c698a3473363437cc95e3c195bfdb056d7b509e"} Oct 03 18:35:12 crc kubenswrapper[4835]: I1003 18:35:12.674448 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c608fb-d66b-44a0-8a50-930c83040b5c","Type":"ContainerStarted","Data":"f1a83e0acbe67fc7ab8a5d9446116938d8f5efa57d8c9a97bbc06e20c310f635"} Oct 03 18:35:12 crc kubenswrapper[4835]: I1003 18:35:12.675171 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c608fb-d66b-44a0-8a50-930c83040b5c","Type":"ContainerStarted","Data":"d6a7a5a8a0115496d28d5035ccaa4405bf14455163d8752cea7f1d893f5d5e85"} Oct 03 18:35:12 crc kubenswrapper[4835]: I1003 18:35:12.950574 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 18:35:13 crc kubenswrapper[4835]: I1003 18:35:13.686131 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"22c608fb-d66b-44a0-8a50-930c83040b5c","Type":"ContainerStarted","Data":"48f1a858f5c03455f6fe8336219583cc3313591c545d9e6af0d4648988eee2e9"} Oct 03 18:35:15 crc kubenswrapper[4835]: I1003 18:35:15.711693 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c608fb-d66b-44a0-8a50-930c83040b5c","Type":"ContainerStarted","Data":"e187d5c9482b2da051394b1d20d01dc84c62e84d786b0afc593ee9d21083e20c"} Oct 03 18:35:15 crc kubenswrapper[4835]: I1003 18:35:15.712118 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 18:35:16 crc kubenswrapper[4835]: I1003 18:35:16.959308 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 18:35:16 crc kubenswrapper[4835]: I1003 18:35:16.960408 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 18:35:16 crc kubenswrapper[4835]: I1003 18:35:16.965765 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 18:35:16 crc kubenswrapper[4835]: I1003 18:35:16.988678 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.007586483 podStartE2EDuration="6.988658414s" podCreationTimestamp="2025-10-03 18:35:10 +0000 UTC" firstStartedPulling="2025-10-03 18:35:11.528303693 +0000 UTC m=+1253.244244585" lastFinishedPulling="2025-10-03 18:35:14.509375634 +0000 UTC m=+1256.225316516" observedRunningTime="2025-10-03 18:35:15.735115324 +0000 UTC m=+1257.451056196" watchObservedRunningTime="2025-10-03 18:35:16.988658414 +0000 UTC m=+1258.704599296" Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.637261 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.732563 4835 generic.go:334] "Generic (PLEG): container finished" podID="e4eb715e-8684-471a-9383-f3be1ce5be53" containerID="68d1701767f131fd92108bcbb6b0653e53df73c66fdf48876d8f697a7fa137e9" exitCode=137 Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.732639 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.732689 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e4eb715e-8684-471a-9383-f3be1ce5be53","Type":"ContainerDied","Data":"68d1701767f131fd92108bcbb6b0653e53df73c66fdf48876d8f697a7fa137e9"} Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.732716 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e4eb715e-8684-471a-9383-f3be1ce5be53","Type":"ContainerDied","Data":"bb1b0fcafe138d6c998cdc473db6c1ab4d547aee0fb700eb653409d3ba09210d"} Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.732733 4835 scope.go:117] "RemoveContainer" containerID="68d1701767f131fd92108bcbb6b0653e53df73c66fdf48876d8f697a7fa137e9" Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.738877 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.760132 4835 scope.go:117] "RemoveContainer" containerID="68d1701767f131fd92108bcbb6b0653e53df73c66fdf48876d8f697a7fa137e9" Oct 03 18:35:17 crc kubenswrapper[4835]: E1003 18:35:17.762424 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68d1701767f131fd92108bcbb6b0653e53df73c66fdf48876d8f697a7fa137e9\": container with ID starting with 68d1701767f131fd92108bcbb6b0653e53df73c66fdf48876d8f697a7fa137e9 not found: ID does not exist" containerID="68d1701767f131fd92108bcbb6b0653e53df73c66fdf48876d8f697a7fa137e9" Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.762463 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68d1701767f131fd92108bcbb6b0653e53df73c66fdf48876d8f697a7fa137e9"} err="failed to get container status \"68d1701767f131fd92108bcbb6b0653e53df73c66fdf48876d8f697a7fa137e9\": rpc error: code = NotFound desc = could not find container \"68d1701767f131fd92108bcbb6b0653e53df73c66fdf48876d8f697a7fa137e9\": container with ID starting with 68d1701767f131fd92108bcbb6b0653e53df73c66fdf48876d8f697a7fa137e9 not found: ID does not exist" Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.772497 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4eb715e-8684-471a-9383-f3be1ce5be53-config-data\") pod \"e4eb715e-8684-471a-9383-f3be1ce5be53\" (UID: \"e4eb715e-8684-471a-9383-f3be1ce5be53\") " Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.772620 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4eb715e-8684-471a-9383-f3be1ce5be53-combined-ca-bundle\") pod \"e4eb715e-8684-471a-9383-f3be1ce5be53\" (UID: \"e4eb715e-8684-471a-9383-f3be1ce5be53\") " Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.773369 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6tfl\" (UniqueName: \"kubernetes.io/projected/e4eb715e-8684-471a-9383-f3be1ce5be53-kube-api-access-w6tfl\") pod \"e4eb715e-8684-471a-9383-f3be1ce5be53\" (UID: \"e4eb715e-8684-471a-9383-f3be1ce5be53\") " Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.777793 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e4eb715e-8684-471a-9383-f3be1ce5be53-kube-api-access-w6tfl" (OuterVolumeSpecName: "kube-api-access-w6tfl") pod "e4eb715e-8684-471a-9383-f3be1ce5be53" (UID: "e4eb715e-8684-471a-9383-f3be1ce5be53"). InnerVolumeSpecName "kube-api-access-w6tfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.812560 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4eb715e-8684-471a-9383-f3be1ce5be53-config-data" (OuterVolumeSpecName: "config-data") pod "e4eb715e-8684-471a-9383-f3be1ce5be53" (UID: "e4eb715e-8684-471a-9383-f3be1ce5be53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.827130 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4eb715e-8684-471a-9383-f3be1ce5be53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4eb715e-8684-471a-9383-f3be1ce5be53" (UID: "e4eb715e-8684-471a-9383-f3be1ce5be53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.875375 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4eb715e-8684-471a-9383-f3be1ce5be53-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.875409 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4eb715e-8684-471a-9383-f3be1ce5be53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.875422 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6tfl\" (UniqueName: \"kubernetes.io/projected/e4eb715e-8684-471a-9383-f3be1ce5be53-kube-api-access-w6tfl\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.934430 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.934884 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.936254 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 18:35:17 crc kubenswrapper[4835]: I1003 18:35:17.940324 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.062286 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.070025 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.082833 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 18:35:18 crc kubenswrapper[4835]: E1003 18:35:18.083686 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4eb715e-8684-471a-9383-f3be1ce5be53" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.083709 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4eb715e-8684-471a-9383-f3be1ce5be53" 
containerName="nova-cell1-novncproxy-novncproxy" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.083938 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4eb715e-8684-471a-9383-f3be1ce5be53" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.090160 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.101847 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.102129 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.102386 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.107337 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.183312 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a86172-d2db-4c6d-92a3-7a747955f3a4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5a86172-d2db-4c6d-92a3-7a747955f3a4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.183490 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5a86172-d2db-4c6d-92a3-7a747955f3a4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5a86172-d2db-4c6d-92a3-7a747955f3a4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.183552 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a86172-d2db-4c6d-92a3-7a747955f3a4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5a86172-d2db-4c6d-92a3-7a747955f3a4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.183597 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kss6h\" (UniqueName: \"kubernetes.io/projected/d5a86172-d2db-4c6d-92a3-7a747955f3a4-kube-api-access-kss6h\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5a86172-d2db-4c6d-92a3-7a747955f3a4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.183663 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a86172-d2db-4c6d-92a3-7a747955f3a4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5a86172-d2db-4c6d-92a3-7a747955f3a4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.285258 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5a86172-d2db-4c6d-92a3-7a747955f3a4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5a86172-d2db-4c6d-92a3-7a747955f3a4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 
18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.285433 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a86172-d2db-4c6d-92a3-7a747955f3a4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5a86172-d2db-4c6d-92a3-7a747955f3a4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.285584 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kss6h\" (UniqueName: \"kubernetes.io/projected/d5a86172-d2db-4c6d-92a3-7a747955f3a4-kube-api-access-kss6h\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5a86172-d2db-4c6d-92a3-7a747955f3a4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.285673 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a86172-d2db-4c6d-92a3-7a747955f3a4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5a86172-d2db-4c6d-92a3-7a747955f3a4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.285723 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a86172-d2db-4c6d-92a3-7a747955f3a4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5a86172-d2db-4c6d-92a3-7a747955f3a4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.289091 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a86172-d2db-4c6d-92a3-7a747955f3a4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5a86172-d2db-4c6d-92a3-7a747955f3a4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.289538 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5a86172-d2db-4c6d-92a3-7a747955f3a4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5a86172-d2db-4c6d-92a3-7a747955f3a4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.289832 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5a86172-d2db-4c6d-92a3-7a747955f3a4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5a86172-d2db-4c6d-92a3-7a747955f3a4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.289878 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5a86172-d2db-4c6d-92a3-7a747955f3a4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5a86172-d2db-4c6d-92a3-7a747955f3a4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.302822 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kss6h\" (UniqueName: \"kubernetes.io/projected/d5a86172-d2db-4c6d-92a3-7a747955f3a4-kube-api-access-kss6h\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5a86172-d2db-4c6d-92a3-7a747955f3a4\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.407383 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.745727 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.751632 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.888744 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4eb715e-8684-471a-9383-f3be1ce5be53" path="/var/lib/kubelet/pods/e4eb715e-8684-471a-9383-f3be1ce5be53/volumes" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.894696 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.969455 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c84bdb669-6frp9"] Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.971290 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:18 crc kubenswrapper[4835]: I1003 18:35:18.989902 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c84bdb669-6frp9"] Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.108668 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-config\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.108733 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg88n\" (UniqueName: \"kubernetes.io/projected/7729606d-e769-4a52-9ebc-830f6a79d3ff-kube-api-access-lg88n\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.108937 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-dns-swift-storage-0\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.108974 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-ovsdbserver-sb\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.109010 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-dns-svc\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.109045 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-ovsdbserver-nb\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.210808 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-dns-svc\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.210878 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-ovsdbserver-nb\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.211133 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-config\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.211201 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg88n\" (UniqueName: \"kubernetes.io/projected/7729606d-e769-4a52-9ebc-830f6a79d3ff-kube-api-access-lg88n\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.211297 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-dns-swift-storage-0\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.211326 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-ovsdbserver-sb\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.212156 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-ovsdbserver-sb\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.213131 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-ovsdbserver-nb\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.213293 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-dns-svc\") pod 
\"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.213816 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-config\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.214462 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-dns-swift-storage-0\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.235418 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg88n\" (UniqueName: \"kubernetes.io/projected/7729606d-e769-4a52-9ebc-830f6a79d3ff-kube-api-access-lg88n\") pod \"dnsmasq-dns-6c84bdb669-6frp9\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.411942 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.760864 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5a86172-d2db-4c6d-92a3-7a747955f3a4","Type":"ContainerStarted","Data":"800b5016ae5fe2e4c0ef3768f868d3bee620d6268f13bdd9916b107727c2e87e"} Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.761249 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5a86172-d2db-4c6d-92a3-7a747955f3a4","Type":"ContainerStarted","Data":"8470f7ef6503d261dcba309c8799a7bba76f92d5b3df8f28ec74b71bb8f70618"} Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.783504 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.78348761 podStartE2EDuration="1.78348761s" podCreationTimestamp="2025-10-03 18:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:35:19.776588848 +0000 UTC m=+1261.492529720" watchObservedRunningTime="2025-10-03 18:35:19.78348761 +0000 UTC m=+1261.499428472" Oct 03 18:35:19 crc kubenswrapper[4835]: I1003 18:35:19.902344 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c84bdb669-6frp9"] Oct 03 18:35:19 crc kubenswrapper[4835]: W1003 18:35:19.928493 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7729606d_e769_4a52_9ebc_830f6a79d3ff.slice/crio-0c924cfd5e69dd92b6c09b4bd9541dd1b5aa23d17d9f2eeddd23e09e387832c8 WatchSource:0}: Error finding container 0c924cfd5e69dd92b6c09b4bd9541dd1b5aa23d17d9f2eeddd23e09e387832c8: Status 404 returned error can't find the container with id 0c924cfd5e69dd92b6c09b4bd9541dd1b5aa23d17d9f2eeddd23e09e387832c8 Oct 03 18:35:20 crc kubenswrapper[4835]: I1003 18:35:20.770022 4835 generic.go:334] "Generic (PLEG): container finished" podID="7729606d-e769-4a52-9ebc-830f6a79d3ff" 
containerID="ea0f25410c43d25ed9f3938f40875dc0392ab904da83c5fd6f8db244733fc46c" exitCode=0 Oct 03 18:35:20 crc kubenswrapper[4835]: I1003 18:35:20.770145 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" event={"ID":"7729606d-e769-4a52-9ebc-830f6a79d3ff","Type":"ContainerDied","Data":"ea0f25410c43d25ed9f3938f40875dc0392ab904da83c5fd6f8db244733fc46c"} Oct 03 18:35:20 crc kubenswrapper[4835]: I1003 18:35:20.770331 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" event={"ID":"7729606d-e769-4a52-9ebc-830f6a79d3ff","Type":"ContainerStarted","Data":"0c924cfd5e69dd92b6c09b4bd9541dd1b5aa23d17d9f2eeddd23e09e387832c8"} Oct 03 18:35:21 crc kubenswrapper[4835]: I1003 18:35:21.190546 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:35:21 crc kubenswrapper[4835]: I1003 18:35:21.191263 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerName="ceilometer-central-agent" containerID="cri-o://d6a7a5a8a0115496d28d5035ccaa4405bf14455163d8752cea7f1d893f5d5e85" gracePeriod=30 Oct 03 18:35:21 crc kubenswrapper[4835]: I1003 18:35:21.191324 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerName="proxy-httpd" containerID="cri-o://e187d5c9482b2da051394b1d20d01dc84c62e84d786b0afc593ee9d21083e20c" gracePeriod=30 Oct 03 18:35:21 crc kubenswrapper[4835]: I1003 18:35:21.191338 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerName="ceilometer-notification-agent" containerID="cri-o://f1a83e0acbe67fc7ab8a5d9446116938d8f5efa57d8c9a97bbc06e20c310f635" gracePeriod=30 Oct 03 18:35:21 crc kubenswrapper[4835]: I1003 18:35:21.191332 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerName="sg-core" containerID="cri-o://48f1a858f5c03455f6fe8336219583cc3313591c545d9e6af0d4648988eee2e9" gracePeriod=30 Oct 03 18:35:21 crc kubenswrapper[4835]: I1003 18:35:21.451909 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:35:21 crc kubenswrapper[4835]: I1003 18:35:21.784268 4835 generic.go:334] "Generic (PLEG): container finished" podID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerID="e187d5c9482b2da051394b1d20d01dc84c62e84d786b0afc593ee9d21083e20c" exitCode=0 Oct 03 18:35:21 crc kubenswrapper[4835]: I1003 18:35:21.784299 4835 generic.go:334] "Generic (PLEG): container finished" podID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerID="48f1a858f5c03455f6fe8336219583cc3313591c545d9e6af0d4648988eee2e9" exitCode=2 Oct 03 18:35:21 crc kubenswrapper[4835]: I1003 18:35:21.784307 4835 generic.go:334] "Generic (PLEG): container finished" podID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerID="d6a7a5a8a0115496d28d5035ccaa4405bf14455163d8752cea7f1d893f5d5e85" exitCode=0 Oct 03 18:35:21 crc kubenswrapper[4835]: I1003 18:35:21.784370 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c608fb-d66b-44a0-8a50-930c83040b5c","Type":"ContainerDied","Data":"e187d5c9482b2da051394b1d20d01dc84c62e84d786b0afc593ee9d21083e20c"} Oct 03 18:35:21 crc kubenswrapper[4835]: I1003 18:35:21.784415 4835 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c608fb-d66b-44a0-8a50-930c83040b5c","Type":"ContainerDied","Data":"48f1a858f5c03455f6fe8336219583cc3313591c545d9e6af0d4648988eee2e9"} Oct 03 18:35:21 crc kubenswrapper[4835]: I1003 18:35:21.784428 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c608fb-d66b-44a0-8a50-930c83040b5c","Type":"ContainerDied","Data":"d6a7a5a8a0115496d28d5035ccaa4405bf14455163d8752cea7f1d893f5d5e85"} Oct 03 18:35:21 crc kubenswrapper[4835]: I1003 18:35:21.788207 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c9dba733-1e73-4ed3-a029-53ff385f53a5" containerName="nova-api-log" containerID="cri-o://b40bdcf4296a048744855fbf2661efc8ff2b4a7771b713fc5528404c5a2d2bf3" gracePeriod=30 Oct 03 18:35:21 crc kubenswrapper[4835]: I1003 18:35:21.789256 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" event={"ID":"7729606d-e769-4a52-9ebc-830f6a79d3ff","Type":"ContainerStarted","Data":"d3f58fdab93f62baa6af045ff1a0c57a51e2e37dc61cdde574d4ff5feb2a0b15"} Oct 03 18:35:21 crc kubenswrapper[4835]: I1003 18:35:21.789289 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:21 crc kubenswrapper[4835]: I1003 18:35:21.789330 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c9dba733-1e73-4ed3-a029-53ff385f53a5" containerName="nova-api-api" containerID="cri-o://3e91860e255938939441bf01f1505e8ed1aec50033d1a72f2c184ada6d06b3ef" gracePeriod=30 Oct 03 18:35:22 crc kubenswrapper[4835]: I1003 18:35:22.802496 4835 generic.go:334] "Generic (PLEG): container finished" podID="c9dba733-1e73-4ed3-a029-53ff385f53a5" containerID="3e91860e255938939441bf01f1505e8ed1aec50033d1a72f2c184ada6d06b3ef" exitCode=0 Oct 03 18:35:22 crc kubenswrapper[4835]: I1003 18:35:22.802822 4835 generic.go:334] "Generic (PLEG): container finished" podID="c9dba733-1e73-4ed3-a029-53ff385f53a5" containerID="b40bdcf4296a048744855fbf2661efc8ff2b4a7771b713fc5528404c5a2d2bf3" exitCode=143 Oct 03 18:35:22 crc kubenswrapper[4835]: I1003 18:35:22.803680 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9dba733-1e73-4ed3-a029-53ff385f53a5","Type":"ContainerDied","Data":"3e91860e255938939441bf01f1505e8ed1aec50033d1a72f2c184ada6d06b3ef"} Oct 03 18:35:22 crc kubenswrapper[4835]: I1003 18:35:22.803706 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9dba733-1e73-4ed3-a029-53ff385f53a5","Type":"ContainerDied","Data":"b40bdcf4296a048744855fbf2661efc8ff2b4a7771b713fc5528404c5a2d2bf3"} Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.185922 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.207538 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" podStartSLOduration=5.207518211 podStartE2EDuration="5.207518211s" podCreationTimestamp="2025-10-03 18:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:35:21.812989381 +0000 UTC m=+1263.528930253" watchObservedRunningTime="2025-10-03 18:35:23.207518211 +0000 UTC m=+1264.923459083" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.308037 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj57j\" (UniqueName: \"kubernetes.io/projected/c9dba733-1e73-4ed3-a029-53ff385f53a5-kube-api-access-jj57j\") pod \"c9dba733-1e73-4ed3-a029-53ff385f53a5\" (UID: \"c9dba733-1e73-4ed3-a029-53ff385f53a5\") " Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.308384 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9dba733-1e73-4ed3-a029-53ff385f53a5-combined-ca-bundle\") pod \"c9dba733-1e73-4ed3-a029-53ff385f53a5\" (UID: \"c9dba733-1e73-4ed3-a029-53ff385f53a5\") " Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.308527 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9dba733-1e73-4ed3-a029-53ff385f53a5-logs\") pod \"c9dba733-1e73-4ed3-a029-53ff385f53a5\" (UID: \"c9dba733-1e73-4ed3-a029-53ff385f53a5\") " Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.308549 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9dba733-1e73-4ed3-a029-53ff385f53a5-config-data\") pod \"c9dba733-1e73-4ed3-a029-53ff385f53a5\" (UID: \"c9dba733-1e73-4ed3-a029-53ff385f53a5\") " Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.309052 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9dba733-1e73-4ed3-a029-53ff385f53a5-logs" (OuterVolumeSpecName: "logs") pod "c9dba733-1e73-4ed3-a029-53ff385f53a5" (UID: "c9dba733-1e73-4ed3-a029-53ff385f53a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.315474 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9dba733-1e73-4ed3-a029-53ff385f53a5-kube-api-access-jj57j" (OuterVolumeSpecName: "kube-api-access-jj57j") pod "c9dba733-1e73-4ed3-a029-53ff385f53a5" (UID: "c9dba733-1e73-4ed3-a029-53ff385f53a5"). InnerVolumeSpecName "kube-api-access-jj57j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.357262 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9dba733-1e73-4ed3-a029-53ff385f53a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9dba733-1e73-4ed3-a029-53ff385f53a5" (UID: "c9dba733-1e73-4ed3-a029-53ff385f53a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.395670 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9dba733-1e73-4ed3-a029-53ff385f53a5-config-data" (OuterVolumeSpecName: "config-data") pod "c9dba733-1e73-4ed3-a029-53ff385f53a5" (UID: "c9dba733-1e73-4ed3-a029-53ff385f53a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.408289 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.410098 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj57j\" (UniqueName: \"kubernetes.io/projected/c9dba733-1e73-4ed3-a029-53ff385f53a5-kube-api-access-jj57j\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.410118 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9dba733-1e73-4ed3-a029-53ff385f53a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.410127 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9dba733-1e73-4ed3-a029-53ff385f53a5-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.410138 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9dba733-1e73-4ed3-a029-53ff385f53a5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.814263 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9dba733-1e73-4ed3-a029-53ff385f53a5","Type":"ContainerDied","Data":"0601e3b1ed5946b1e57cc1dc1c2a86d65be50470f36818ea9eba2cce57f87957"} Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.814438 4835 scope.go:117] "RemoveContainer" containerID="3e91860e255938939441bf01f1505e8ed1aec50033d1a72f2c184ada6d06b3ef" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.814302 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.817553 4835 generic.go:334] "Generic (PLEG): container finished" podID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerID="f1a83e0acbe67fc7ab8a5d9446116938d8f5efa57d8c9a97bbc06e20c310f635" exitCode=0 Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.817591 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c608fb-d66b-44a0-8a50-930c83040b5c","Type":"ContainerDied","Data":"f1a83e0acbe67fc7ab8a5d9446116938d8f5efa57d8c9a97bbc06e20c310f635"} Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.817613 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"22c608fb-d66b-44a0-8a50-930c83040b5c","Type":"ContainerDied","Data":"24b2c7122c24eb215c9a529d6c698a3473363437cc95e3c195bfdb056d7b509e"} Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.817626 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24b2c7122c24eb215c9a529d6c698a3473363437cc95e3c195bfdb056d7b509e" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.848750 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.859370 4835 scope.go:117] "RemoveContainer" containerID="b40bdcf4296a048744855fbf2661efc8ff2b4a7771b713fc5528404c5a2d2bf3" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.867127 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.892908 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.917842 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 18:35:23 crc kubenswrapper[4835]: E1003 18:35:23.918343 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9dba733-1e73-4ed3-a029-53ff385f53a5" containerName="nova-api-log" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918362 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9dba733-1e73-4ed3-a029-53ff385f53a5" containerName="nova-api-log" Oct 03 18:35:23 crc kubenswrapper[4835]: E1003 18:35:23.918375 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerName="sg-core" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918383 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerName="sg-core" Oct 03 18:35:23 crc kubenswrapper[4835]: E1003 18:35:23.918396 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerName="ceilometer-central-agent" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918402 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerName="ceilometer-central-agent" Oct 03 18:35:23 crc kubenswrapper[4835]: E1003 18:35:23.918414 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerName="ceilometer-notification-agent" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918420 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerName="ceilometer-notification-agent" Oct 03 18:35:23 crc kubenswrapper[4835]: E1003 18:35:23.918433 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9dba733-1e73-4ed3-a029-53ff385f53a5" containerName="nova-api-api" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918438 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9dba733-1e73-4ed3-a029-53ff385f53a5" containerName="nova-api-api" Oct 03 18:35:23 crc kubenswrapper[4835]: E1003 18:35:23.918452 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerName="proxy-httpd" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918458 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerName="proxy-httpd" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918502 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-sg-core-conf-yaml\") pod \"22c608fb-d66b-44a0-8a50-930c83040b5c\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918588 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c608fb-d66b-44a0-8a50-930c83040b5c-log-httpd\") pod \"22c608fb-d66b-44a0-8a50-930c83040b5c\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918659 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccvkf\" (UniqueName: \"kubernetes.io/projected/22c608fb-d66b-44a0-8a50-930c83040b5c-kube-api-access-ccvkf\") pod \"22c608fb-d66b-44a0-8a50-930c83040b5c\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918666 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerName="sg-core" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918686 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerName="ceilometer-notification-agent" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918701 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerName="ceilometer-central-agent" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918710 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9dba733-1e73-4ed3-a029-53ff385f53a5" containerName="nova-api-api" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918718 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9dba733-1e73-4ed3-a029-53ff385f53a5" containerName="nova-api-log" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918731 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" containerName="proxy-httpd" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918742 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-config-data\") pod \"22c608fb-d66b-44a0-8a50-930c83040b5c\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918763 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-ceilometer-tls-certs\") pod \"22c608fb-d66b-44a0-8a50-930c83040b5c\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918782 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-combined-ca-bundle\") pod \"22c608fb-d66b-44a0-8a50-930c83040b5c\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918808 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-scripts\") pod \"22c608fb-d66b-44a0-8a50-930c83040b5c\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.918914 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c608fb-d66b-44a0-8a50-930c83040b5c-run-httpd\") pod \"22c608fb-d66b-44a0-8a50-930c83040b5c\" (UID: \"22c608fb-d66b-44a0-8a50-930c83040b5c\") " Oct 03 
18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.919229 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22c608fb-d66b-44a0-8a50-930c83040b5c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "22c608fb-d66b-44a0-8a50-930c83040b5c" (UID: "22c608fb-d66b-44a0-8a50-930c83040b5c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.919556 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22c608fb-d66b-44a0-8a50-930c83040b5c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "22c608fb-d66b-44a0-8a50-930c83040b5c" (UID: "22c608fb-d66b-44a0-8a50-930c83040b5c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.919813 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c608fb-d66b-44a0-8a50-930c83040b5c-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.919842 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.919844 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/22c608fb-d66b-44a0-8a50-930c83040b5c-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.923333 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c608fb-d66b-44a0-8a50-930c83040b5c-kube-api-access-ccvkf" (OuterVolumeSpecName: "kube-api-access-ccvkf") pod "22c608fb-d66b-44a0-8a50-930c83040b5c" (UID: "22c608fb-d66b-44a0-8a50-930c83040b5c"). InnerVolumeSpecName "kube-api-access-ccvkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.923503 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.923586 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.925616 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.933769 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.943401 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-scripts" (OuterVolumeSpecName: "scripts") pod "22c608fb-d66b-44a0-8a50-930c83040b5c" (UID: "22c608fb-d66b-44a0-8a50-930c83040b5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:23 crc kubenswrapper[4835]: I1003 18:35:23.959311 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "22c608fb-d66b-44a0-8a50-930c83040b5c" (UID: "22c608fb-d66b-44a0-8a50-930c83040b5c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.000229 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "22c608fb-d66b-44a0-8a50-930c83040b5c" (UID: "22c608fb-d66b-44a0-8a50-930c83040b5c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.012871 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22c608fb-d66b-44a0-8a50-930c83040b5c" (UID: "22c608fb-d66b-44a0-8a50-930c83040b5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.021829 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmfx2\" (UniqueName: \"kubernetes.io/projected/1636db3a-cfc9-4fa2-9083-6485dc58c81e-kube-api-access-lmfx2\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.021898 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.021972 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.022037 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-public-tls-certs\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.022101 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1636db3a-cfc9-4fa2-9083-6485dc58c81e-logs\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.022124 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-config-data\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.022845 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.022878 4835 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-ccvkf\" (UniqueName: \"kubernetes.io/projected/22c608fb-d66b-44a0-8a50-930c83040b5c-kube-api-access-ccvkf\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.023132 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.023164 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.023178 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.042747 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-config-data" (OuterVolumeSpecName: "config-data") pod "22c608fb-d66b-44a0-8a50-930c83040b5c" (UID: "22c608fb-d66b-44a0-8a50-930c83040b5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.125602 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.125699 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.125749 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-public-tls-certs\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.125782 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1636db3a-cfc9-4fa2-9083-6485dc58c81e-logs\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.125797 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-config-data\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.125882 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmfx2\" (UniqueName: \"kubernetes.io/projected/1636db3a-cfc9-4fa2-9083-6485dc58c81e-kube-api-access-lmfx2\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc 
kubenswrapper[4835]: I1003 18:35:24.126006 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c608fb-d66b-44a0-8a50-930c83040b5c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.126473 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1636db3a-cfc9-4fa2-9083-6485dc58c81e-logs\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.130737 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.130773 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-config-data\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.131024 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.134200 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-public-tls-certs\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.144162 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmfx2\" (UniqueName: \"kubernetes.io/projected/1636db3a-cfc9-4fa2-9083-6485dc58c81e-kube-api-access-lmfx2\") pod \"nova-api-0\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.408748 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.827289 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.863857 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.894322 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9dba733-1e73-4ed3-a029-53ff385f53a5" path="/var/lib/kubelet/pods/c9dba733-1e73-4ed3-a029-53ff385f53a5/volumes" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.895296 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.895336 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.904702 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.907311 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.910266 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.910356 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.910480 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 18:35:24 crc kubenswrapper[4835]: I1003 18:35:24.912835 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.041932 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d393ce1-3377-4099-9948-423066ae1ee5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.041992 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d393ce1-3377-4099-9948-423066ae1ee5-scripts\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.042170 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d393ce1-3377-4099-9948-423066ae1ee5-config-data\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.042431 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d393ce1-3377-4099-9948-423066ae1ee5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.042822 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d393ce1-3377-4099-9948-423066ae1ee5-run-httpd\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " 
pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.042929 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d393ce1-3377-4099-9948-423066ae1ee5-log-httpd\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.043053 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z78q\" (UniqueName: \"kubernetes.io/projected/1d393ce1-3377-4099-9948-423066ae1ee5-kube-api-access-4z78q\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.043146 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d393ce1-3377-4099-9948-423066ae1ee5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.145097 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d393ce1-3377-4099-9948-423066ae1ee5-run-httpd\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.145397 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d393ce1-3377-4099-9948-423066ae1ee5-log-httpd\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.145433 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z78q\" (UniqueName: \"kubernetes.io/projected/1d393ce1-3377-4099-9948-423066ae1ee5-kube-api-access-4z78q\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.145458 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d393ce1-3377-4099-9948-423066ae1ee5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.145498 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d393ce1-3377-4099-9948-423066ae1ee5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.145524 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d393ce1-3377-4099-9948-423066ae1ee5-scripts\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.145577 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d393ce1-3377-4099-9948-423066ae1ee5-config-data\") pod 
\"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.145604 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d393ce1-3377-4099-9948-423066ae1ee5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.145920 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d393ce1-3377-4099-9948-423066ae1ee5-run-httpd\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.146327 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d393ce1-3377-4099-9948-423066ae1ee5-log-httpd\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.150457 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d393ce1-3377-4099-9948-423066ae1ee5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.150979 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d393ce1-3377-4099-9948-423066ae1ee5-config-data\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.151045 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d393ce1-3377-4099-9948-423066ae1ee5-scripts\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.160233 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1d393ce1-3377-4099-9948-423066ae1ee5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.160650 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d393ce1-3377-4099-9948-423066ae1ee5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.163471 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z78q\" (UniqueName: \"kubernetes.io/projected/1d393ce1-3377-4099-9948-423066ae1ee5-kube-api-access-4z78q\") pod \"ceilometer-0\" (UID: \"1d393ce1-3377-4099-9948-423066ae1ee5\") " pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.317378 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.804792 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.837111 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d393ce1-3377-4099-9948-423066ae1ee5","Type":"ContainerStarted","Data":"78b15b78bd9a6c503a09fd88fe7fcae061b69065cae53814a337b63355d8281e"} Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.839323 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1636db3a-cfc9-4fa2-9083-6485dc58c81e","Type":"ContainerStarted","Data":"7dd9c023a79e667efc6ee3d27b31d6de62d7e17f9f62d1a1ee75341d4fc184e4"} Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.839367 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1636db3a-cfc9-4fa2-9083-6485dc58c81e","Type":"ContainerStarted","Data":"8413f0942b6e19958e7d0d9320b2a9bd9abc82bec8c04dc261fa1f078832660a"} Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.839379 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1636db3a-cfc9-4fa2-9083-6485dc58c81e","Type":"ContainerStarted","Data":"3954713609d4fdb6fa40cc06b37fe2662f3b183c8046a83e02aea41d4c5cba7a"} Oct 03 18:35:25 crc kubenswrapper[4835]: I1003 18:35:25.862677 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.8626560960000003 podStartE2EDuration="2.862656096s" podCreationTimestamp="2025-10-03 18:35:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:35:25.85782847 +0000 UTC m=+1267.573769372" watchObservedRunningTime="2025-10-03 18:35:25.862656096 +0000 UTC m=+1267.578596998" Oct 03 18:35:26 crc kubenswrapper[4835]: I1003 18:35:26.849258 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d393ce1-3377-4099-9948-423066ae1ee5","Type":"ContainerStarted","Data":"ab36062752fb0b03cb09a524490da2dced3101e9ac80c67c520cdb5c5caa1522"} Oct 03 18:35:26 crc kubenswrapper[4835]: I1003 18:35:26.849988 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d393ce1-3377-4099-9948-423066ae1ee5","Type":"ContainerStarted","Data":"56859935656afd2893060674adcb8ee7d45a7bf785d7a09511f09cb51e81d05f"} Oct 03 18:35:26 crc kubenswrapper[4835]: I1003 18:35:26.888465 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c608fb-d66b-44a0-8a50-930c83040b5c" path="/var/lib/kubelet/pods/22c608fb-d66b-44a0-8a50-930c83040b5c/volumes" Oct 03 18:35:27 crc kubenswrapper[4835]: I1003 18:35:27.865924 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d393ce1-3377-4099-9948-423066ae1ee5","Type":"ContainerStarted","Data":"d9a1e887ad084ac42a2b14b25ecd58d6c5e23ea02ac546207fbff74983596a26"} Oct 03 18:35:28 crc kubenswrapper[4835]: I1003 18:35:28.408655 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:28 crc kubenswrapper[4835]: I1003 18:35:28.427483 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:28 crc kubenswrapper[4835]: I1003 18:35:28.900903 4835 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1d393ce1-3377-4099-9948-423066ae1ee5","Type":"ContainerStarted","Data":"4f222e96b8ee5d183c4579ce4abc76116528ba6b24c446e59aff2b192a3a6223"} Oct 03 18:35:28 crc kubenswrapper[4835]: I1003 18:35:28.914888 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 03 18:35:28 crc kubenswrapper[4835]: I1003 18:35:28.977726 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.637034372 podStartE2EDuration="4.977695563s" podCreationTimestamp="2025-10-03 18:35:24 +0000 UTC" firstStartedPulling="2025-10-03 18:35:25.797472303 +0000 UTC m=+1267.513413175" lastFinishedPulling="2025-10-03 18:35:28.138133454 +0000 UTC m=+1269.854074366" observedRunningTime="2025-10-03 18:35:28.962506459 +0000 UTC m=+1270.678447331" watchObservedRunningTime="2025-10-03 18:35:28.977695563 +0000 UTC m=+1270.693636455" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.103144 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-f49gn"] Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.104923 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-f49gn" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.107250 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.107487 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.117392 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-f49gn"] Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.241791 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-scripts\") pod \"nova-cell1-cell-mapping-f49gn\" (UID: \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\") " pod="openstack/nova-cell1-cell-mapping-f49gn" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.241841 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-f49gn\" (UID: \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\") " pod="openstack/nova-cell1-cell-mapping-f49gn" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.242129 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch9kb\" (UniqueName: \"kubernetes.io/projected/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-kube-api-access-ch9kb\") pod \"nova-cell1-cell-mapping-f49gn\" (UID: \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\") " pod="openstack/nova-cell1-cell-mapping-f49gn" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.242190 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-config-data\") pod \"nova-cell1-cell-mapping-f49gn\" (UID: \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\") " pod="openstack/nova-cell1-cell-mapping-f49gn" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.343925 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch9kb\" (UniqueName: \"kubernetes.io/projected/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-kube-api-access-ch9kb\") pod \"nova-cell1-cell-mapping-f49gn\" (UID: \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\") " pod="openstack/nova-cell1-cell-mapping-f49gn" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.343976 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-config-data\") pod \"nova-cell1-cell-mapping-f49gn\" (UID: \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\") " pod="openstack/nova-cell1-cell-mapping-f49gn" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.344039 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-scripts\") pod \"nova-cell1-cell-mapping-f49gn\" (UID: \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\") " pod="openstack/nova-cell1-cell-mapping-f49gn" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.344056 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-f49gn\" (UID: \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\") " pod="openstack/nova-cell1-cell-mapping-f49gn" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.349945 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-f49gn\" (UID: \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\") " pod="openstack/nova-cell1-cell-mapping-f49gn" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.350786 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-scripts\") pod \"nova-cell1-cell-mapping-f49gn\" (UID: \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\") " pod="openstack/nova-cell1-cell-mapping-f49gn" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.350799 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-config-data\") pod \"nova-cell1-cell-mapping-f49gn\" (UID: \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\") " pod="openstack/nova-cell1-cell-mapping-f49gn" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.365621 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch9kb\" (UniqueName: \"kubernetes.io/projected/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-kube-api-access-ch9kb\") pod \"nova-cell1-cell-mapping-f49gn\" (UID: \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\") " pod="openstack/nova-cell1-cell-mapping-f49gn" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.413459 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.436130 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-f49gn" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.476124 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f55868c59-5xsq8"] Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.476353 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" podUID="996f5308-25a4-41d7-a335-f255cd014871" containerName="dnsmasq-dns" containerID="cri-o://f90cd51b5dc91ed8d3c7b76cb5f4f97e99225eeed2f3ca1f2e4a6e6aafa83cd8" gracePeriod=10 Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.890213 4835 generic.go:334] "Generic (PLEG): container finished" podID="996f5308-25a4-41d7-a335-f255cd014871" containerID="f90cd51b5dc91ed8d3c7b76cb5f4f97e99225eeed2f3ca1f2e4a6e6aafa83cd8" exitCode=0 Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.890268 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" event={"ID":"996f5308-25a4-41d7-a335-f255cd014871","Type":"ContainerDied","Data":"f90cd51b5dc91ed8d3c7b76cb5f4f97e99225eeed2f3ca1f2e4a6e6aafa83cd8"} Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.890698 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 18:35:29 crc kubenswrapper[4835]: I1003 18:35:29.975865 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-f49gn"] Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.082712 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.165319 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-dns-svc\") pod \"996f5308-25a4-41d7-a335-f255cd014871\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.165736 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-config\") pod \"996f5308-25a4-41d7-a335-f255cd014871\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.165775 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-dns-swift-storage-0\") pod \"996f5308-25a4-41d7-a335-f255cd014871\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.165815 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-ovsdbserver-sb\") pod \"996f5308-25a4-41d7-a335-f255cd014871\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.165833 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbb2s\" (UniqueName: \"kubernetes.io/projected/996f5308-25a4-41d7-a335-f255cd014871-kube-api-access-xbb2s\") pod \"996f5308-25a4-41d7-a335-f255cd014871\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.165878 4835 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-ovsdbserver-nb\") pod \"996f5308-25a4-41d7-a335-f255cd014871\" (UID: \"996f5308-25a4-41d7-a335-f255cd014871\") " Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.175213 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/996f5308-25a4-41d7-a335-f255cd014871-kube-api-access-xbb2s" (OuterVolumeSpecName: "kube-api-access-xbb2s") pod "996f5308-25a4-41d7-a335-f255cd014871" (UID: "996f5308-25a4-41d7-a335-f255cd014871"). InnerVolumeSpecName "kube-api-access-xbb2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.270348 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbb2s\" (UniqueName: \"kubernetes.io/projected/996f5308-25a4-41d7-a335-f255cd014871-kube-api-access-xbb2s\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.311821 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-config" (OuterVolumeSpecName: "config") pod "996f5308-25a4-41d7-a335-f255cd014871" (UID: "996f5308-25a4-41d7-a335-f255cd014871"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.315999 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "996f5308-25a4-41d7-a335-f255cd014871" (UID: "996f5308-25a4-41d7-a335-f255cd014871"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.321706 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "996f5308-25a4-41d7-a335-f255cd014871" (UID: "996f5308-25a4-41d7-a335-f255cd014871"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.323951 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "996f5308-25a4-41d7-a335-f255cd014871" (UID: "996f5308-25a4-41d7-a335-f255cd014871"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.326197 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "996f5308-25a4-41d7-a335-f255cd014871" (UID: "996f5308-25a4-41d7-a335-f255cd014871"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.374315 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.374346 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.374359 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.374369 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.374379 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/996f5308-25a4-41d7-a335-f255cd014871-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.910399 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-f49gn" event={"ID":"34c9c48d-f7af-44a7-ad66-ecaeefca60a5","Type":"ContainerStarted","Data":"1d9f3c19bfad5d80e939598b95d6962719777161a191deecbf5ecbb7586c159a"} Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.911290 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-f49gn" event={"ID":"34c9c48d-f7af-44a7-ad66-ecaeefca60a5","Type":"ContainerStarted","Data":"7b9265150a1ea36ad104bed9b5fa25a86262b701aae12ada9a54af6a4fe081ce"} Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.917462 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.917507 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f55868c59-5xsq8" event={"ID":"996f5308-25a4-41d7-a335-f255cd014871","Type":"ContainerDied","Data":"9c6f36a01ba2a30ee63c0500f0d50db2bbe2e8cb4938cb858d3374ee01cea314"} Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.917548 4835 scope.go:117] "RemoveContainer" containerID="f90cd51b5dc91ed8d3c7b76cb5f4f97e99225eeed2f3ca1f2e4a6e6aafa83cd8" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.945413 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-f49gn" podStartSLOduration=1.9453949640000001 podStartE2EDuration="1.945394964s" podCreationTimestamp="2025-10-03 18:35:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:35:30.929375192 +0000 UTC m=+1272.645316064" watchObservedRunningTime="2025-10-03 18:35:30.945394964 +0000 UTC m=+1272.661335836" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.948411 4835 scope.go:117] "RemoveContainer" containerID="3930ed02ba81d90d039f9144c83eb9f8200b06eabee3a3fbe09fdc92d6926d28" Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.958432 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f55868c59-5xsq8"] Oct 03 18:35:30 crc kubenswrapper[4835]: I1003 18:35:30.967813 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f55868c59-5xsq8"] Oct 03 18:35:32 crc kubenswrapper[4835]: I1003 18:35:32.902496 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="996f5308-25a4-41d7-a335-f255cd014871" path="/var/lib/kubelet/pods/996f5308-25a4-41d7-a335-f255cd014871/volumes" Oct 03 18:35:34 crc kubenswrapper[4835]: I1003 18:35:34.409631 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 18:35:34 crc kubenswrapper[4835]: I1003 18:35:34.409680 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 18:35:34 crc kubenswrapper[4835]: I1003 18:35:34.972178 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-f49gn" event={"ID":"34c9c48d-f7af-44a7-ad66-ecaeefca60a5","Type":"ContainerDied","Data":"1d9f3c19bfad5d80e939598b95d6962719777161a191deecbf5ecbb7586c159a"} Oct 03 18:35:34 crc kubenswrapper[4835]: I1003 18:35:34.972189 4835 generic.go:334] "Generic (PLEG): container finished" podID="34c9c48d-f7af-44a7-ad66-ecaeefca60a5" containerID="1d9f3c19bfad5d80e939598b95d6962719777161a191deecbf5ecbb7586c159a" exitCode=0 Oct 03 18:35:35 crc kubenswrapper[4835]: I1003 18:35:35.426252 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1636db3a-cfc9-4fa2-9083-6485dc58c81e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 18:35:35 crc kubenswrapper[4835]: I1003 18:35:35.426244 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1636db3a-cfc9-4fa2-9083-6485dc58c81e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 18:35:36 crc 
kubenswrapper[4835]: I1003 18:35:36.427611 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-f49gn" Oct 03 18:35:36 crc kubenswrapper[4835]: I1003 18:35:36.504848 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch9kb\" (UniqueName: \"kubernetes.io/projected/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-kube-api-access-ch9kb\") pod \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\" (UID: \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\") " Oct 03 18:35:36 crc kubenswrapper[4835]: I1003 18:35:36.504930 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-scripts\") pod \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\" (UID: \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\") " Oct 03 18:35:36 crc kubenswrapper[4835]: I1003 18:35:36.505124 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-combined-ca-bundle\") pod \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\" (UID: \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\") " Oct 03 18:35:36 crc kubenswrapper[4835]: I1003 18:35:36.505149 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-config-data\") pod \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\" (UID: \"34c9c48d-f7af-44a7-ad66-ecaeefca60a5\") " Oct 03 18:35:36 crc kubenswrapper[4835]: I1003 18:35:36.518244 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-scripts" (OuterVolumeSpecName: "scripts") pod "34c9c48d-f7af-44a7-ad66-ecaeefca60a5" (UID: "34c9c48d-f7af-44a7-ad66-ecaeefca60a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:36 crc kubenswrapper[4835]: I1003 18:35:36.518318 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-kube-api-access-ch9kb" (OuterVolumeSpecName: "kube-api-access-ch9kb") pod "34c9c48d-f7af-44a7-ad66-ecaeefca60a5" (UID: "34c9c48d-f7af-44a7-ad66-ecaeefca60a5"). InnerVolumeSpecName "kube-api-access-ch9kb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:35:36 crc kubenswrapper[4835]: I1003 18:35:36.539202 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-config-data" (OuterVolumeSpecName: "config-data") pod "34c9c48d-f7af-44a7-ad66-ecaeefca60a5" (UID: "34c9c48d-f7af-44a7-ad66-ecaeefca60a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:36 crc kubenswrapper[4835]: I1003 18:35:36.561669 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34c9c48d-f7af-44a7-ad66-ecaeefca60a5" (UID: "34c9c48d-f7af-44a7-ad66-ecaeefca60a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:36 crc kubenswrapper[4835]: I1003 18:35:36.608305 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:36 crc kubenswrapper[4835]: I1003 18:35:36.608344 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:36 crc kubenswrapper[4835]: I1003 18:35:36.608356 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch9kb\" (UniqueName: \"kubernetes.io/projected/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-kube-api-access-ch9kb\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:36 crc kubenswrapper[4835]: I1003 18:35:36.608370 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34c9c48d-f7af-44a7-ad66-ecaeefca60a5-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:36 crc kubenswrapper[4835]: I1003 18:35:36.996516 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-f49gn" event={"ID":"34c9c48d-f7af-44a7-ad66-ecaeefca60a5","Type":"ContainerDied","Data":"7b9265150a1ea36ad104bed9b5fa25a86262b701aae12ada9a54af6a4fe081ce"} Oct 03 18:35:36 crc kubenswrapper[4835]: I1003 18:35:36.996861 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b9265150a1ea36ad104bed9b5fa25a86262b701aae12ada9a54af6a4fe081ce" Oct 03 18:35:36 crc kubenswrapper[4835]: I1003 18:35:36.996581 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-f49gn" Oct 03 18:35:37 crc kubenswrapper[4835]: I1003 18:35:37.177431 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:35:37 crc kubenswrapper[4835]: I1003 18:35:37.177646 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1636db3a-cfc9-4fa2-9083-6485dc58c81e" containerName="nova-api-log" containerID="cri-o://8413f0942b6e19958e7d0d9320b2a9bd9abc82bec8c04dc261fa1f078832660a" gracePeriod=30 Oct 03 18:35:37 crc kubenswrapper[4835]: I1003 18:35:37.178133 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1636db3a-cfc9-4fa2-9083-6485dc58c81e" containerName="nova-api-api" containerID="cri-o://7dd9c023a79e667efc6ee3d27b31d6de62d7e17f9f62d1a1ee75341d4fc184e4" gracePeriod=30 Oct 03 18:35:37 crc kubenswrapper[4835]: I1003 18:35:37.258145 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:35:37 crc kubenswrapper[4835]: I1003 18:35:37.258465 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="195a095e-91be-47e7-97de-a0dc42c301f7" containerName="nova-metadata-log" containerID="cri-o://e3ca1784094827ceb926c99764374ee5fdcba194fb4aea72a1439281ac9157d4" gracePeriod=30 Oct 03 18:35:37 crc kubenswrapper[4835]: I1003 18:35:37.258790 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="195a095e-91be-47e7-97de-a0dc42c301f7" containerName="nova-metadata-metadata" containerID="cri-o://aff09693aa259e7dad1f626c7160906cb2b6e3ccdca0cb010c99eba4ca6e5a44" gracePeriod=30 Oct 03 18:35:37 crc 
kubenswrapper[4835]: I1003 18:35:37.279825 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 18:35:37 crc kubenswrapper[4835]: I1003 18:35:37.280029 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="949e50ea-2b76-4506-8fbb-fb58efd9f020" containerName="nova-scheduler-scheduler" containerID="cri-o://f077c3054fece741a7b9f5660d2d8290d0de7ad4704eae303ac506c3628747fc" gracePeriod=30 Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.009752 4835 generic.go:334] "Generic (PLEG): container finished" podID="195a095e-91be-47e7-97de-a0dc42c301f7" containerID="e3ca1784094827ceb926c99764374ee5fdcba194fb4aea72a1439281ac9157d4" exitCode=143 Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.009845 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"195a095e-91be-47e7-97de-a0dc42c301f7","Type":"ContainerDied","Data":"e3ca1784094827ceb926c99764374ee5fdcba194fb4aea72a1439281ac9157d4"} Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.014043 4835 generic.go:334] "Generic (PLEG): container finished" podID="1636db3a-cfc9-4fa2-9083-6485dc58c81e" containerID="8413f0942b6e19958e7d0d9320b2a9bd9abc82bec8c04dc261fa1f078832660a" exitCode=143 Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.014100 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1636db3a-cfc9-4fa2-9083-6485dc58c81e","Type":"ContainerDied","Data":"8413f0942b6e19958e7d0d9320b2a9bd9abc82bec8c04dc261fa1f078832660a"} Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.545462 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.646488 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-nova-metadata-tls-certs\") pod \"195a095e-91be-47e7-97de-a0dc42c301f7\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.646566 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-combined-ca-bundle\") pod \"195a095e-91be-47e7-97de-a0dc42c301f7\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.646645 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-config-data\") pod \"195a095e-91be-47e7-97de-a0dc42c301f7\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.646697 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv4lf\" (UniqueName: \"kubernetes.io/projected/195a095e-91be-47e7-97de-a0dc42c301f7-kube-api-access-sv4lf\") pod \"195a095e-91be-47e7-97de-a0dc42c301f7\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.646739 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/195a095e-91be-47e7-97de-a0dc42c301f7-logs\") pod \"195a095e-91be-47e7-97de-a0dc42c301f7\" (UID: \"195a095e-91be-47e7-97de-a0dc42c301f7\") " 
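The entries above record a full teardown sequence for a pod: a "SyncLoop DELETE" from the API, "Killing container with a grace period", the PLEG "ContainerDied" events, the per-volume "UnmountVolume.TearDown succeeded" and "Volume detached" messages, and eventually the orphaned pod volumes cleanup. As a minimal sketch (not part of this capture), the following Python helper shows one way to pull that sequence for a single pod UID out of journal text in the run-together form shown here; the pod UID and the marker strings are copied verbatim from the entries above, while the script itself, its name, and its invocation are assumptions for illustration only.

    #!/usr/bin/env python3
    """Hypothetical helper (not part of this log): read journal text like the
    kubenswrapper output above from stdin, split the run-together stream back
    into one entry per line, and print teardown-related entries for one pod."""
    import re
    import sys

    # UID of openstack/nova-metadata-0 as it appears in the entries above.
    POD_UID = "195a095e-91be-47e7-97de-a0dc42c301f7"

    # Lifecycle markers that occur verbatim in the messages above.
    MARKERS = (
        "Killing container with a grace period",
        "ContainerDied",
        "UnmountVolume.TearDown succeeded",
        "Volume detached for volume",
        "Cleaned up orphaned pod volumes dir",
    )

    text = sys.stdin.read()
    # Each journal entry starts with a syslog-style timestamp and the host "crc".
    entries = re.split(r"(?=Oct 03 \d{2}:\d{2}:\d{2} crc )", text)

    for entry in entries:
        entry = " ".join(entry.split())  # collapse any wrapping inside one entry
        if POD_UID in entry and any(m in entry for m in MARKERS):
            print(entry)

Assuming the journal text has been saved to a file, something like "python3 pod_teardown.py < kubelet.log" would print only the matching entries in order; on this node the messages are emitted by kubenswrapper[4835], so pre-filtering the journal by that identifier (for example with journalctl -t kubenswrapper) may also help, though the exact unit and identifier names can differ between deployments.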
Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.647654 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/195a095e-91be-47e7-97de-a0dc42c301f7-logs" (OuterVolumeSpecName: "logs") pod "195a095e-91be-47e7-97de-a0dc42c301f7" (UID: "195a095e-91be-47e7-97de-a0dc42c301f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.654317 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/195a095e-91be-47e7-97de-a0dc42c301f7-kube-api-access-sv4lf" (OuterVolumeSpecName: "kube-api-access-sv4lf") pod "195a095e-91be-47e7-97de-a0dc42c301f7" (UID: "195a095e-91be-47e7-97de-a0dc42c301f7"). InnerVolumeSpecName "kube-api-access-sv4lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.682915 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-config-data" (OuterVolumeSpecName: "config-data") pod "195a095e-91be-47e7-97de-a0dc42c301f7" (UID: "195a095e-91be-47e7-97de-a0dc42c301f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.710414 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "195a095e-91be-47e7-97de-a0dc42c301f7" (UID: "195a095e-91be-47e7-97de-a0dc42c301f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.721943 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "195a095e-91be-47e7-97de-a0dc42c301f7" (UID: "195a095e-91be-47e7-97de-a0dc42c301f7"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.749333 4835 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.749390 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.749400 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/195a095e-91be-47e7-97de-a0dc42c301f7-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.749409 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv4lf\" (UniqueName: \"kubernetes.io/projected/195a095e-91be-47e7-97de-a0dc42c301f7-kube-api-access-sv4lf\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.749438 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/195a095e-91be-47e7-97de-a0dc42c301f7-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.776850 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.850209 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-public-tls-certs\") pod \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.850493 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-combined-ca-bundle\") pod \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.850535 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-config-data\") pod \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.850576 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-internal-tls-certs\") pod \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.850614 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmfx2\" (UniqueName: \"kubernetes.io/projected/1636db3a-cfc9-4fa2-9083-6485dc58c81e-kube-api-access-lmfx2\") pod \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.850679 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1636db3a-cfc9-4fa2-9083-6485dc58c81e-logs\") pod \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\" (UID: \"1636db3a-cfc9-4fa2-9083-6485dc58c81e\") " Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.851444 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1636db3a-cfc9-4fa2-9083-6485dc58c81e-logs" (OuterVolumeSpecName: "logs") pod "1636db3a-cfc9-4fa2-9083-6485dc58c81e" (UID: "1636db3a-cfc9-4fa2-9083-6485dc58c81e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.857611 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1636db3a-cfc9-4fa2-9083-6485dc58c81e-kube-api-access-lmfx2" (OuterVolumeSpecName: "kube-api-access-lmfx2") pod "1636db3a-cfc9-4fa2-9083-6485dc58c81e" (UID: "1636db3a-cfc9-4fa2-9083-6485dc58c81e"). InnerVolumeSpecName "kube-api-access-lmfx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.880255 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1636db3a-cfc9-4fa2-9083-6485dc58c81e" (UID: "1636db3a-cfc9-4fa2-9083-6485dc58c81e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.885490 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-config-data" (OuterVolumeSpecName: "config-data") pod "1636db3a-cfc9-4fa2-9083-6485dc58c81e" (UID: "1636db3a-cfc9-4fa2-9083-6485dc58c81e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.905532 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1636db3a-cfc9-4fa2-9083-6485dc58c81e" (UID: "1636db3a-cfc9-4fa2-9083-6485dc58c81e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.927594 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1636db3a-cfc9-4fa2-9083-6485dc58c81e" (UID: "1636db3a-cfc9-4fa2-9083-6485dc58c81e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.954025 4835 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.954088 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.954105 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.954118 4835 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1636db3a-cfc9-4fa2-9083-6485dc58c81e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.954163 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmfx2\" (UniqueName: \"kubernetes.io/projected/1636db3a-cfc9-4fa2-9083-6485dc58c81e-kube-api-access-lmfx2\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:38 crc kubenswrapper[4835]: I1003 18:35:38.954182 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1636db3a-cfc9-4fa2-9083-6485dc58c81e-logs\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.032920 4835 generic.go:334] "Generic (PLEG): container finished" podID="949e50ea-2b76-4506-8fbb-fb58efd9f020" containerID="f077c3054fece741a7b9f5660d2d8290d0de7ad4704eae303ac506c3628747fc" exitCode=0 Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.032969 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"949e50ea-2b76-4506-8fbb-fb58efd9f020","Type":"ContainerDied","Data":"f077c3054fece741a7b9f5660d2d8290d0de7ad4704eae303ac506c3628747fc"} Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.036724 4835 generic.go:334] "Generic (PLEG): container finished" podID="195a095e-91be-47e7-97de-a0dc42c301f7" containerID="aff09693aa259e7dad1f626c7160906cb2b6e3ccdca0cb010c99eba4ca6e5a44" exitCode=0 Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.036769 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"195a095e-91be-47e7-97de-a0dc42c301f7","Type":"ContainerDied","Data":"aff09693aa259e7dad1f626c7160906cb2b6e3ccdca0cb010c99eba4ca6e5a44"} Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.036795 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"195a095e-91be-47e7-97de-a0dc42c301f7","Type":"ContainerDied","Data":"60bac3981298c25d9a431fdb784676c64018342662e66ca7ef80f58306b1d9a0"} Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.036811 4835 scope.go:117] "RemoveContainer" containerID="aff09693aa259e7dad1f626c7160906cb2b6e3ccdca0cb010c99eba4ca6e5a44" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.036928 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.044048 4835 generic.go:334] "Generic (PLEG): container finished" podID="1636db3a-cfc9-4fa2-9083-6485dc58c81e" containerID="7dd9c023a79e667efc6ee3d27b31d6de62d7e17f9f62d1a1ee75341d4fc184e4" exitCode=0 Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.044259 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.045200 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1636db3a-cfc9-4fa2-9083-6485dc58c81e","Type":"ContainerDied","Data":"7dd9c023a79e667efc6ee3d27b31d6de62d7e17f9f62d1a1ee75341d4fc184e4"} Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.045270 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1636db3a-cfc9-4fa2-9083-6485dc58c81e","Type":"ContainerDied","Data":"3954713609d4fdb6fa40cc06b37fe2662f3b183c8046a83e02aea41d4c5cba7a"} Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.087499 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.106017 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.111725 4835 scope.go:117] "RemoveContainer" containerID="e3ca1784094827ceb926c99764374ee5fdcba194fb4aea72a1439281ac9157d4" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.120240 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.135863 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.148112 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 18:35:39 crc kubenswrapper[4835]: E1003 18:35:39.148534 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="195a095e-91be-47e7-97de-a0dc42c301f7" containerName="nova-metadata-metadata" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.148553 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="195a095e-91be-47e7-97de-a0dc42c301f7" containerName="nova-metadata-metadata" Oct 03 18:35:39 crc kubenswrapper[4835]: E1003 18:35:39.148570 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c9c48d-f7af-44a7-ad66-ecaeefca60a5" containerName="nova-manage" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.148577 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c9c48d-f7af-44a7-ad66-ecaeefca60a5" containerName="nova-manage" Oct 03 18:35:39 crc kubenswrapper[4835]: E1003 18:35:39.148589 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="195a095e-91be-47e7-97de-a0dc42c301f7" containerName="nova-metadata-log" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.148596 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="195a095e-91be-47e7-97de-a0dc42c301f7" containerName="nova-metadata-log" Oct 03 18:35:39 crc kubenswrapper[4835]: E1003 18:35:39.148609 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="996f5308-25a4-41d7-a335-f255cd014871" containerName="dnsmasq-dns" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.148614 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="996f5308-25a4-41d7-a335-f255cd014871" 
containerName="dnsmasq-dns" Oct 03 18:35:39 crc kubenswrapper[4835]: E1003 18:35:39.148625 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="996f5308-25a4-41d7-a335-f255cd014871" containerName="init" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.148631 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="996f5308-25a4-41d7-a335-f255cd014871" containerName="init" Oct 03 18:35:39 crc kubenswrapper[4835]: E1003 18:35:39.148638 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1636db3a-cfc9-4fa2-9083-6485dc58c81e" containerName="nova-api-api" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.148644 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1636db3a-cfc9-4fa2-9083-6485dc58c81e" containerName="nova-api-api" Oct 03 18:35:39 crc kubenswrapper[4835]: E1003 18:35:39.148671 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1636db3a-cfc9-4fa2-9083-6485dc58c81e" containerName="nova-api-log" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.148677 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1636db3a-cfc9-4fa2-9083-6485dc58c81e" containerName="nova-api-log" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.148869 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="996f5308-25a4-41d7-a335-f255cd014871" containerName="dnsmasq-dns" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.148885 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="195a095e-91be-47e7-97de-a0dc42c301f7" containerName="nova-metadata-metadata" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.148901 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1636db3a-cfc9-4fa2-9083-6485dc58c81e" containerName="nova-api-api" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.148912 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="34c9c48d-f7af-44a7-ad66-ecaeefca60a5" containerName="nova-manage" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.148926 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="195a095e-91be-47e7-97de-a0dc42c301f7" containerName="nova-metadata-log" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.148937 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1636db3a-cfc9-4fa2-9083-6485dc58c81e" containerName="nova-api-log" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.152636 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.153638 4835 scope.go:117] "RemoveContainer" containerID="aff09693aa259e7dad1f626c7160906cb2b6e3ccdca0cb010c99eba4ca6e5a44" Oct 03 18:35:39 crc kubenswrapper[4835]: E1003 18:35:39.154230 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff09693aa259e7dad1f626c7160906cb2b6e3ccdca0cb010c99eba4ca6e5a44\": container with ID starting with aff09693aa259e7dad1f626c7160906cb2b6e3ccdca0cb010c99eba4ca6e5a44 not found: ID does not exist" containerID="aff09693aa259e7dad1f626c7160906cb2b6e3ccdca0cb010c99eba4ca6e5a44" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.154280 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff09693aa259e7dad1f626c7160906cb2b6e3ccdca0cb010c99eba4ca6e5a44"} err="failed to get container status \"aff09693aa259e7dad1f626c7160906cb2b6e3ccdca0cb010c99eba4ca6e5a44\": rpc error: code = NotFound desc = could not find container \"aff09693aa259e7dad1f626c7160906cb2b6e3ccdca0cb010c99eba4ca6e5a44\": container with ID starting with aff09693aa259e7dad1f626c7160906cb2b6e3ccdca0cb010c99eba4ca6e5a44 not found: ID does not exist" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.154307 4835 scope.go:117] "RemoveContainer" containerID="e3ca1784094827ceb926c99764374ee5fdcba194fb4aea72a1439281ac9157d4" Oct 03 18:35:39 crc kubenswrapper[4835]: E1003 18:35:39.154798 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3ca1784094827ceb926c99764374ee5fdcba194fb4aea72a1439281ac9157d4\": container with ID starting with e3ca1784094827ceb926c99764374ee5fdcba194fb4aea72a1439281ac9157d4 not found: ID does not exist" containerID="e3ca1784094827ceb926c99764374ee5fdcba194fb4aea72a1439281ac9157d4" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.154829 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3ca1784094827ceb926c99764374ee5fdcba194fb4aea72a1439281ac9157d4"} err="failed to get container status \"e3ca1784094827ceb926c99764374ee5fdcba194fb4aea72a1439281ac9157d4\": rpc error: code = NotFound desc = could not find container \"e3ca1784094827ceb926c99764374ee5fdcba194fb4aea72a1439281ac9157d4\": container with ID starting with e3ca1784094827ceb926c99764374ee5fdcba194fb4aea72a1439281ac9157d4 not found: ID does not exist" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.154870 4835 scope.go:117] "RemoveContainer" containerID="7dd9c023a79e667efc6ee3d27b31d6de62d7e17f9f62d1a1ee75341d4fc184e4" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.155573 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.155723 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.155807 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.163360 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.180342 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 
18:35:39.182462 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.186759 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.187036 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.189552 4835 scope.go:117] "RemoveContainer" containerID="8413f0942b6e19958e7d0d9320b2a9bd9abc82bec8c04dc261fa1f078832660a" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.193393 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.220226 4835 scope.go:117] "RemoveContainer" containerID="7dd9c023a79e667efc6ee3d27b31d6de62d7e17f9f62d1a1ee75341d4fc184e4" Oct 03 18:35:39 crc kubenswrapper[4835]: E1003 18:35:39.220576 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dd9c023a79e667efc6ee3d27b31d6de62d7e17f9f62d1a1ee75341d4fc184e4\": container with ID starting with 7dd9c023a79e667efc6ee3d27b31d6de62d7e17f9f62d1a1ee75341d4fc184e4 not found: ID does not exist" containerID="7dd9c023a79e667efc6ee3d27b31d6de62d7e17f9f62d1a1ee75341d4fc184e4" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.220673 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dd9c023a79e667efc6ee3d27b31d6de62d7e17f9f62d1a1ee75341d4fc184e4"} err="failed to get container status \"7dd9c023a79e667efc6ee3d27b31d6de62d7e17f9f62d1a1ee75341d4fc184e4\": rpc error: code = NotFound desc = could not find container \"7dd9c023a79e667efc6ee3d27b31d6de62d7e17f9f62d1a1ee75341d4fc184e4\": container with ID starting with 7dd9c023a79e667efc6ee3d27b31d6de62d7e17f9f62d1a1ee75341d4fc184e4 not found: ID does not exist" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.220703 4835 scope.go:117] "RemoveContainer" containerID="8413f0942b6e19958e7d0d9320b2a9bd9abc82bec8c04dc261fa1f078832660a" Oct 03 18:35:39 crc kubenswrapper[4835]: E1003 18:35:39.220984 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8413f0942b6e19958e7d0d9320b2a9bd9abc82bec8c04dc261fa1f078832660a\": container with ID starting with 8413f0942b6e19958e7d0d9320b2a9bd9abc82bec8c04dc261fa1f078832660a not found: ID does not exist" containerID="8413f0942b6e19958e7d0d9320b2a9bd9abc82bec8c04dc261fa1f078832660a" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.221124 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8413f0942b6e19958e7d0d9320b2a9bd9abc82bec8c04dc261fa1f078832660a"} err="failed to get container status \"8413f0942b6e19958e7d0d9320b2a9bd9abc82bec8c04dc261fa1f078832660a\": rpc error: code = NotFound desc = could not find container \"8413f0942b6e19958e7d0d9320b2a9bd9abc82bec8c04dc261fa1f078832660a\": container with ID starting with 8413f0942b6e19958e7d0d9320b2a9bd9abc82bec8c04dc261fa1f078832660a not found: ID does not exist" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.260380 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/935cc750-935d-4dc0-8a1c-3bce43a57402-config-data\") pod 
\"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.260893 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmkbd\" (UniqueName: \"kubernetes.io/projected/935cc750-935d-4dc0-8a1c-3bce43a57402-kube-api-access-dmkbd\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.260973 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a67c0bec-a5df-4ffa-a903-6f73e88a0d19-config-data\") pod \"nova-metadata-0\" (UID: \"a67c0bec-a5df-4ffa-a903-6f73e88a0d19\") " pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.261181 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a67c0bec-a5df-4ffa-a903-6f73e88a0d19-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a67c0bec-a5df-4ffa-a903-6f73e88a0d19\") " pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.261254 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ph5h\" (UniqueName: \"kubernetes.io/projected/a67c0bec-a5df-4ffa-a903-6f73e88a0d19-kube-api-access-7ph5h\") pod \"nova-metadata-0\" (UID: \"a67c0bec-a5df-4ffa-a903-6f73e88a0d19\") " pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.261287 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a67c0bec-a5df-4ffa-a903-6f73e88a0d19-logs\") pod \"nova-metadata-0\" (UID: \"a67c0bec-a5df-4ffa-a903-6f73e88a0d19\") " pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.261474 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a67c0bec-a5df-4ffa-a903-6f73e88a0d19-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a67c0bec-a5df-4ffa-a903-6f73e88a0d19\") " pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.261559 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/935cc750-935d-4dc0-8a1c-3bce43a57402-public-tls-certs\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.261740 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/935cc750-935d-4dc0-8a1c-3bce43a57402-internal-tls-certs\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.261786 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/935cc750-935d-4dc0-8a1c-3bce43a57402-logs\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 
18:35:39.261840 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/935cc750-935d-4dc0-8a1c-3bce43a57402-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.365634 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/935cc750-935d-4dc0-8a1c-3bce43a57402-config-data\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.365713 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmkbd\" (UniqueName: \"kubernetes.io/projected/935cc750-935d-4dc0-8a1c-3bce43a57402-kube-api-access-dmkbd\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.365738 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a67c0bec-a5df-4ffa-a903-6f73e88a0d19-config-data\") pod \"nova-metadata-0\" (UID: \"a67c0bec-a5df-4ffa-a903-6f73e88a0d19\") " pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.365775 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a67c0bec-a5df-4ffa-a903-6f73e88a0d19-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a67c0bec-a5df-4ffa-a903-6f73e88a0d19\") " pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.365791 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ph5h\" (UniqueName: \"kubernetes.io/projected/a67c0bec-a5df-4ffa-a903-6f73e88a0d19-kube-api-access-7ph5h\") pod \"nova-metadata-0\" (UID: \"a67c0bec-a5df-4ffa-a903-6f73e88a0d19\") " pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.365809 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a67c0bec-a5df-4ffa-a903-6f73e88a0d19-logs\") pod \"nova-metadata-0\" (UID: \"a67c0bec-a5df-4ffa-a903-6f73e88a0d19\") " pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.365822 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a67c0bec-a5df-4ffa-a903-6f73e88a0d19-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a67c0bec-a5df-4ffa-a903-6f73e88a0d19\") " pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.365839 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/935cc750-935d-4dc0-8a1c-3bce43a57402-public-tls-certs\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.365888 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/935cc750-935d-4dc0-8a1c-3bce43a57402-internal-tls-certs\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " 
pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.365906 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/935cc750-935d-4dc0-8a1c-3bce43a57402-logs\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.365934 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/935cc750-935d-4dc0-8a1c-3bce43a57402-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.368713 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a67c0bec-a5df-4ffa-a903-6f73e88a0d19-logs\") pod \"nova-metadata-0\" (UID: \"a67c0bec-a5df-4ffa-a903-6f73e88a0d19\") " pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.369353 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/935cc750-935d-4dc0-8a1c-3bce43a57402-logs\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.386177 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a67c0bec-a5df-4ffa-a903-6f73e88a0d19-config-data\") pod \"nova-metadata-0\" (UID: \"a67c0bec-a5df-4ffa-a903-6f73e88a0d19\") " pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.388725 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a67c0bec-a5df-4ffa-a903-6f73e88a0d19-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a67c0bec-a5df-4ffa-a903-6f73e88a0d19\") " pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.388874 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/935cc750-935d-4dc0-8a1c-3bce43a57402-config-data\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.389385 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/935cc750-935d-4dc0-8a1c-3bce43a57402-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.395747 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/935cc750-935d-4dc0-8a1c-3bce43a57402-public-tls-certs\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.396299 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/935cc750-935d-4dc0-8a1c-3bce43a57402-internal-tls-certs\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.400963 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a67c0bec-a5df-4ffa-a903-6f73e88a0d19-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a67c0bec-a5df-4ffa-a903-6f73e88a0d19\") " pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.407741 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmkbd\" (UniqueName: \"kubernetes.io/projected/935cc750-935d-4dc0-8a1c-3bce43a57402-kube-api-access-dmkbd\") pod \"nova-api-0\" (UID: \"935cc750-935d-4dc0-8a1c-3bce43a57402\") " pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.422848 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ph5h\" (UniqueName: \"kubernetes.io/projected/a67c0bec-a5df-4ffa-a903-6f73e88a0d19-kube-api-access-7ph5h\") pod \"nova-metadata-0\" (UID: \"a67c0bec-a5df-4ffa-a903-6f73e88a0d19\") " pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.479553 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.513599 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.615427 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.672905 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg4zc\" (UniqueName: \"kubernetes.io/projected/949e50ea-2b76-4506-8fbb-fb58efd9f020-kube-api-access-fg4zc\") pod \"949e50ea-2b76-4506-8fbb-fb58efd9f020\" (UID: \"949e50ea-2b76-4506-8fbb-fb58efd9f020\") " Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.673061 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/949e50ea-2b76-4506-8fbb-fb58efd9f020-config-data\") pod \"949e50ea-2b76-4506-8fbb-fb58efd9f020\" (UID: \"949e50ea-2b76-4506-8fbb-fb58efd9f020\") " Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.673333 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949e50ea-2b76-4506-8fbb-fb58efd9f020-combined-ca-bundle\") pod \"949e50ea-2b76-4506-8fbb-fb58efd9f020\" (UID: \"949e50ea-2b76-4506-8fbb-fb58efd9f020\") " Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.691293 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949e50ea-2b76-4506-8fbb-fb58efd9f020-kube-api-access-fg4zc" (OuterVolumeSpecName: "kube-api-access-fg4zc") pod "949e50ea-2b76-4506-8fbb-fb58efd9f020" (UID: "949e50ea-2b76-4506-8fbb-fb58efd9f020"). InnerVolumeSpecName "kube-api-access-fg4zc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.707475 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949e50ea-2b76-4506-8fbb-fb58efd9f020-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "949e50ea-2b76-4506-8fbb-fb58efd9f020" (UID: "949e50ea-2b76-4506-8fbb-fb58efd9f020"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.713635 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949e50ea-2b76-4506-8fbb-fb58efd9f020-config-data" (OuterVolumeSpecName: "config-data") pod "949e50ea-2b76-4506-8fbb-fb58efd9f020" (UID: "949e50ea-2b76-4506-8fbb-fb58efd9f020"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.778205 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg4zc\" (UniqueName: \"kubernetes.io/projected/949e50ea-2b76-4506-8fbb-fb58efd9f020-kube-api-access-fg4zc\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.778240 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/949e50ea-2b76-4506-8fbb-fb58efd9f020-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.778250 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/949e50ea-2b76-4506-8fbb-fb58efd9f020-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:35:39 crc kubenswrapper[4835]: I1003 18:35:39.953341 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 18:35:39 crc kubenswrapper[4835]: W1003 18:35:39.955761 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod935cc750_935d_4dc0_8a1c_3bce43a57402.slice/crio-ae4ddb1c26c61bc27daa69cebd54835880e8f735e5eff341374e70c8486c1280 WatchSource:0}: Error finding container ae4ddb1c26c61bc27daa69cebd54835880e8f735e5eff341374e70c8486c1280: Status 404 returned error can't find the container with id ae4ddb1c26c61bc27daa69cebd54835880e8f735e5eff341374e70c8486c1280 Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.050611 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.055669 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.056136 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"949e50ea-2b76-4506-8fbb-fb58efd9f020","Type":"ContainerDied","Data":"a5c8489082aae1921379274e630bfb7f04de8bc1cc5a89a61b6a41c0b6199b30"} Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.056167 4835 scope.go:117] "RemoveContainer" containerID="f077c3054fece741a7b9f5660d2d8290d0de7ad4704eae303ac506c3628747fc" Oct 03 18:35:40 crc kubenswrapper[4835]: W1003 18:35:40.059656 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda67c0bec_a5df_4ffa_a903_6f73e88a0d19.slice/crio-b29f43add3453f2664d5dbb6f6b6e77a421d9599eac954a80c21d9545e27fff1 WatchSource:0}: Error finding container b29f43add3453f2664d5dbb6f6b6e77a421d9599eac954a80c21d9545e27fff1: Status 404 returned error can't find the container with id b29f43add3453f2664d5dbb6f6b6e77a421d9599eac954a80c21d9545e27fff1 Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.063678 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"935cc750-935d-4dc0-8a1c-3bce43a57402","Type":"ContainerStarted","Data":"ae4ddb1c26c61bc27daa69cebd54835880e8f735e5eff341374e70c8486c1280"} Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.098390 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.118514 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.127837 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 18:35:40 crc kubenswrapper[4835]: E1003 18:35:40.128216 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949e50ea-2b76-4506-8fbb-fb58efd9f020" containerName="nova-scheduler-scheduler" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.128230 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="949e50ea-2b76-4506-8fbb-fb58efd9f020" containerName="nova-scheduler-scheduler" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.128393 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="949e50ea-2b76-4506-8fbb-fb58efd9f020" containerName="nova-scheduler-scheduler" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.129012 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.134933 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.135354 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.287947 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0f2678-65f3-4fb6-959c-73eee3fbf7de-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bc0f2678-65f3-4fb6-959c-73eee3fbf7de\") " pod="openstack/nova-scheduler-0" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.288011 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0f2678-65f3-4fb6-959c-73eee3fbf7de-config-data\") pod \"nova-scheduler-0\" (UID: \"bc0f2678-65f3-4fb6-959c-73eee3fbf7de\") " pod="openstack/nova-scheduler-0" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.288171 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxhb5\" (UniqueName: \"kubernetes.io/projected/bc0f2678-65f3-4fb6-959c-73eee3fbf7de-kube-api-access-jxhb5\") pod \"nova-scheduler-0\" (UID: \"bc0f2678-65f3-4fb6-959c-73eee3fbf7de\") " pod="openstack/nova-scheduler-0" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.389549 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0f2678-65f3-4fb6-959c-73eee3fbf7de-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bc0f2678-65f3-4fb6-959c-73eee3fbf7de\") " pod="openstack/nova-scheduler-0" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.389599 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0f2678-65f3-4fb6-959c-73eee3fbf7de-config-data\") pod \"nova-scheduler-0\" (UID: \"bc0f2678-65f3-4fb6-959c-73eee3fbf7de\") " pod="openstack/nova-scheduler-0" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.389687 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxhb5\" (UniqueName: \"kubernetes.io/projected/bc0f2678-65f3-4fb6-959c-73eee3fbf7de-kube-api-access-jxhb5\") pod \"nova-scheduler-0\" (UID: \"bc0f2678-65f3-4fb6-959c-73eee3fbf7de\") " pod="openstack/nova-scheduler-0" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.394695 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0f2678-65f3-4fb6-959c-73eee3fbf7de-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bc0f2678-65f3-4fb6-959c-73eee3fbf7de\") " pod="openstack/nova-scheduler-0" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.396883 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0f2678-65f3-4fb6-959c-73eee3fbf7de-config-data\") pod \"nova-scheduler-0\" (UID: \"bc0f2678-65f3-4fb6-959c-73eee3fbf7de\") " pod="openstack/nova-scheduler-0" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.410654 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxhb5\" (UniqueName: 
\"kubernetes.io/projected/bc0f2678-65f3-4fb6-959c-73eee3fbf7de-kube-api-access-jxhb5\") pod \"nova-scheduler-0\" (UID: \"bc0f2678-65f3-4fb6-959c-73eee3fbf7de\") " pod="openstack/nova-scheduler-0" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.458842 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.891496 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1636db3a-cfc9-4fa2-9083-6485dc58c81e" path="/var/lib/kubelet/pods/1636db3a-cfc9-4fa2-9083-6485dc58c81e/volumes" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.892714 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="195a095e-91be-47e7-97de-a0dc42c301f7" path="/var/lib/kubelet/pods/195a095e-91be-47e7-97de-a0dc42c301f7/volumes" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.894017 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949e50ea-2b76-4506-8fbb-fb58efd9f020" path="/var/lib/kubelet/pods/949e50ea-2b76-4506-8fbb-fb58efd9f020/volumes" Oct 03 18:35:40 crc kubenswrapper[4835]: I1003 18:35:40.928301 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 18:35:40 crc kubenswrapper[4835]: W1003 18:35:40.931142 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc0f2678_65f3_4fb6_959c_73eee3fbf7de.slice/crio-ade6e5c77bb2c4b06ea7fb41c3d952bc564071e7dbc9616aaed4652355d6d722 WatchSource:0}: Error finding container ade6e5c77bb2c4b06ea7fb41c3d952bc564071e7dbc9616aaed4652355d6d722: Status 404 returned error can't find the container with id ade6e5c77bb2c4b06ea7fb41c3d952bc564071e7dbc9616aaed4652355d6d722 Oct 03 18:35:41 crc kubenswrapper[4835]: I1003 18:35:41.074055 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bc0f2678-65f3-4fb6-959c-73eee3fbf7de","Type":"ContainerStarted","Data":"ade6e5c77bb2c4b06ea7fb41c3d952bc564071e7dbc9616aaed4652355d6d722"} Oct 03 18:35:41 crc kubenswrapper[4835]: I1003 18:35:41.076299 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a67c0bec-a5df-4ffa-a903-6f73e88a0d19","Type":"ContainerStarted","Data":"2bbe9ebe29aba397f8cb2a95eac945dbf874c5c04062fd927fc93b97cf53aa57"} Oct 03 18:35:41 crc kubenswrapper[4835]: I1003 18:35:41.076343 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a67c0bec-a5df-4ffa-a903-6f73e88a0d19","Type":"ContainerStarted","Data":"f8486176385ef2927a975ee9caa5389b79c24040100f6960b41b27ef6208b7c6"} Oct 03 18:35:41 crc kubenswrapper[4835]: I1003 18:35:41.076355 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a67c0bec-a5df-4ffa-a903-6f73e88a0d19","Type":"ContainerStarted","Data":"b29f43add3453f2664d5dbb6f6b6e77a421d9599eac954a80c21d9545e27fff1"} Oct 03 18:35:41 crc kubenswrapper[4835]: I1003 18:35:41.079464 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"935cc750-935d-4dc0-8a1c-3bce43a57402","Type":"ContainerStarted","Data":"21175bebf6da1e8abb2108ab571d78354d64f2f897d6f2809d9d5272d135c9bd"} Oct 03 18:35:41 crc kubenswrapper[4835]: I1003 18:35:41.079497 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"935cc750-935d-4dc0-8a1c-3bce43a57402","Type":"ContainerStarted","Data":"828d54defc1e245c29d4eeb03ba14acc078ed4a9845503557766ca08d44165c7"} Oct 03 18:35:41 crc kubenswrapper[4835]: I1003 18:35:41.102599 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.102580039 podStartE2EDuration="2.102580039s" podCreationTimestamp="2025-10-03 18:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:35:41.095907692 +0000 UTC m=+1282.811848564" watchObservedRunningTime="2025-10-03 18:35:41.102580039 +0000 UTC m=+1282.818520911" Oct 03 18:35:41 crc kubenswrapper[4835]: I1003 18:35:41.113876 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.113852096 podStartE2EDuration="2.113852096s" podCreationTimestamp="2025-10-03 18:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:35:41.110694677 +0000 UTC m=+1282.826635569" watchObservedRunningTime="2025-10-03 18:35:41.113852096 +0000 UTC m=+1282.829792968" Oct 03 18:35:42 crc kubenswrapper[4835]: I1003 18:35:42.090948 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bc0f2678-65f3-4fb6-959c-73eee3fbf7de","Type":"ContainerStarted","Data":"f6b09132be6bb03436d3d5b2425c42a01d66892fa034d2af25f3753ddb68b168"} Oct 03 18:35:42 crc kubenswrapper[4835]: I1003 18:35:42.116555 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.116537321 podStartE2EDuration="2.116537321s" podCreationTimestamp="2025-10-03 18:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:35:42.111317226 +0000 UTC m=+1283.827258128" watchObservedRunningTime="2025-10-03 18:35:42.116537321 +0000 UTC m=+1283.832478193" Oct 03 18:35:44 crc kubenswrapper[4835]: I1003 18:35:44.514128 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 18:35:44 crc kubenswrapper[4835]: I1003 18:35:44.514497 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 18:35:45 crc kubenswrapper[4835]: I1003 18:35:45.460216 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 18:35:49 crc kubenswrapper[4835]: I1003 18:35:49.480351 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 18:35:49 crc kubenswrapper[4835]: I1003 18:35:49.480907 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 18:35:49 crc kubenswrapper[4835]: I1003 18:35:49.514039 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 18:35:49 crc kubenswrapper[4835]: I1003 18:35:49.514109 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 18:35:50 crc kubenswrapper[4835]: I1003 18:35:50.459543 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 18:35:50 crc kubenswrapper[4835]: I1003 18:35:50.488157 4835 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 18:35:50 crc kubenswrapper[4835]: I1003 18:35:50.498364 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="935cc750-935d-4dc0-8a1c-3bce43a57402" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.225:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 18:35:50 crc kubenswrapper[4835]: I1003 18:35:50.498388 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="935cc750-935d-4dc0-8a1c-3bce43a57402" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.225:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 18:35:50 crc kubenswrapper[4835]: I1003 18:35:50.530334 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a67c0bec-a5df-4ffa-a903-6f73e88a0d19" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.226:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 18:35:50 crc kubenswrapper[4835]: I1003 18:35:50.530349 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a67c0bec-a5df-4ffa-a903-6f73e88a0d19" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.226:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 18:35:51 crc kubenswrapper[4835]: I1003 18:35:51.218148 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 18:35:55 crc kubenswrapper[4835]: I1003 18:35:55.326855 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 18:35:59 crc kubenswrapper[4835]: I1003 18:35:59.488193 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 18:35:59 crc kubenswrapper[4835]: I1003 18:35:59.489613 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 18:35:59 crc kubenswrapper[4835]: I1003 18:35:59.489789 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 18:35:59 crc kubenswrapper[4835]: I1003 18:35:59.501407 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 18:35:59 crc kubenswrapper[4835]: I1003 18:35:59.522367 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 18:35:59 crc kubenswrapper[4835]: I1003 18:35:59.528346 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 18:35:59 crc kubenswrapper[4835]: I1003 18:35:59.534114 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 18:36:00 crc kubenswrapper[4835]: I1003 18:36:00.265605 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 18:36:00 crc kubenswrapper[4835]: I1003 18:36:00.270348 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 18:36:00 crc kubenswrapper[4835]: I1003 18:36:00.285583 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 
18:36:05 crc kubenswrapper[4835]: I1003 18:36:05.358631 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:36:05 crc kubenswrapper[4835]: I1003 18:36:05.359094 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:36:08 crc kubenswrapper[4835]: I1003 18:36:08.403704 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 18:36:09 crc kubenswrapper[4835]: I1003 18:36:09.882641 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 18:36:11 crc kubenswrapper[4835]: I1003 18:36:11.769346 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="6fd26bdb-868b-49db-9698-e7c79eea5cef" containerName="rabbitmq" containerID="cri-o://45412f9b1cc503ca1d65287ceb8ce89e2ec70b786e4f3f800c94087f759517e0" gracePeriod=604797 Oct 03 18:36:12 crc kubenswrapper[4835]: I1003 18:36:12.795738 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="2f5f99aa-dba6-465b-866a-1e293ba51685" containerName="rabbitmq" containerID="cri-o://daa7a9cd848675475ad1e8821bcf022f7f368dccbd8b463091be4977cebba766" gracePeriod=604798 Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.299956 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.346066 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6fd26bdb-868b-49db-9698-e7c79eea5cef-erlang-cookie-secret\") pod \"6fd26bdb-868b-49db-9698-e7c79eea5cef\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.346173 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-server-conf\") pod \"6fd26bdb-868b-49db-9698-e7c79eea5cef\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.346212 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gtln\" (UniqueName: \"kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-kube-api-access-7gtln\") pod \"6fd26bdb-868b-49db-9698-e7c79eea5cef\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.346299 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-config-data\") pod \"6fd26bdb-868b-49db-9698-e7c79eea5cef\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.346338 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-erlang-cookie\") pod \"6fd26bdb-868b-49db-9698-e7c79eea5cef\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.346365 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"6fd26bdb-868b-49db-9698-e7c79eea5cef\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.346408 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-plugins\") pod \"6fd26bdb-868b-49db-9698-e7c79eea5cef\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.346465 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6fd26bdb-868b-49db-9698-e7c79eea5cef-pod-info\") pod \"6fd26bdb-868b-49db-9698-e7c79eea5cef\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.346659 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-plugins-conf\") pod \"6fd26bdb-868b-49db-9698-e7c79eea5cef\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.346762 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6fd26bdb-868b-49db-9698-e7c79eea5cef" (UID: 
"6fd26bdb-868b-49db-9698-e7c79eea5cef"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.346782 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6fd26bdb-868b-49db-9698-e7c79eea5cef" (UID: "6fd26bdb-868b-49db-9698-e7c79eea5cef"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.347505 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6fd26bdb-868b-49db-9698-e7c79eea5cef" (UID: "6fd26bdb-868b-49db-9698-e7c79eea5cef"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.347682 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-tls\") pod \"6fd26bdb-868b-49db-9698-e7c79eea5cef\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.348213 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-confd\") pod \"6fd26bdb-868b-49db-9698-e7c79eea5cef\" (UID: \"6fd26bdb-868b-49db-9698-e7c79eea5cef\") " Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.348872 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.348904 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.348921 4835 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.353725 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-kube-api-access-7gtln" (OuterVolumeSpecName: "kube-api-access-7gtln") pod "6fd26bdb-868b-49db-9698-e7c79eea5cef" (UID: "6fd26bdb-868b-49db-9698-e7c79eea5cef"). InnerVolumeSpecName "kube-api-access-7gtln". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.354306 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "6fd26bdb-868b-49db-9698-e7c79eea5cef" (UID: "6fd26bdb-868b-49db-9698-e7c79eea5cef"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.360191 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6fd26bdb-868b-49db-9698-e7c79eea5cef" (UID: "6fd26bdb-868b-49db-9698-e7c79eea5cef"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.364461 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6fd26bdb-868b-49db-9698-e7c79eea5cef-pod-info" (OuterVolumeSpecName: "pod-info") pod "6fd26bdb-868b-49db-9698-e7c79eea5cef" (UID: "6fd26bdb-868b-49db-9698-e7c79eea5cef"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.405861 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fd26bdb-868b-49db-9698-e7c79eea5cef-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6fd26bdb-868b-49db-9698-e7c79eea5cef" (UID: "6fd26bdb-868b-49db-9698-e7c79eea5cef"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.416650 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-config-data" (OuterVolumeSpecName: "config-data") pod "6fd26bdb-868b-49db-9698-e7c79eea5cef" (UID: "6fd26bdb-868b-49db-9698-e7c79eea5cef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.445237 4835 generic.go:334] "Generic (PLEG): container finished" podID="6fd26bdb-868b-49db-9698-e7c79eea5cef" containerID="45412f9b1cc503ca1d65287ceb8ce89e2ec70b786e4f3f800c94087f759517e0" exitCode=0 Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.445286 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6fd26bdb-868b-49db-9698-e7c79eea5cef","Type":"ContainerDied","Data":"45412f9b1cc503ca1d65287ceb8ce89e2ec70b786e4f3f800c94087f759517e0"} Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.445316 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6fd26bdb-868b-49db-9698-e7c79eea5cef","Type":"ContainerDied","Data":"129989100f23d84c2dd11a0337445a595bf588f441fd488492ba9b05d193a5fd"} Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.445336 4835 scope.go:117] "RemoveContainer" containerID="45412f9b1cc503ca1d65287ceb8ce89e2ec70b786e4f3f800c94087f759517e0" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.445636 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.451724 4835 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6fd26bdb-868b-49db-9698-e7c79eea5cef-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.451758 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gtln\" (UniqueName: \"kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-kube-api-access-7gtln\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.451770 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.451797 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.451807 4835 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6fd26bdb-868b-49db-9698-e7c79eea5cef-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.451819 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.456870 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-server-conf" (OuterVolumeSpecName: "server-conf") pod "6fd26bdb-868b-49db-9698-e7c79eea5cef" (UID: "6fd26bdb-868b-49db-9698-e7c79eea5cef"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.480008 4835 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.491549 4835 scope.go:117] "RemoveContainer" containerID="82cdd552bd8994820289189a6ad58b7da0d0849b9d8048609dea0a6514c0530d" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.528477 4835 scope.go:117] "RemoveContainer" containerID="45412f9b1cc503ca1d65287ceb8ce89e2ec70b786e4f3f800c94087f759517e0" Oct 03 18:36:13 crc kubenswrapper[4835]: E1003 18:36:13.528884 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45412f9b1cc503ca1d65287ceb8ce89e2ec70b786e4f3f800c94087f759517e0\": container with ID starting with 45412f9b1cc503ca1d65287ceb8ce89e2ec70b786e4f3f800c94087f759517e0 not found: ID does not exist" containerID="45412f9b1cc503ca1d65287ceb8ce89e2ec70b786e4f3f800c94087f759517e0" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.528913 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45412f9b1cc503ca1d65287ceb8ce89e2ec70b786e4f3f800c94087f759517e0"} err="failed to get container status \"45412f9b1cc503ca1d65287ceb8ce89e2ec70b786e4f3f800c94087f759517e0\": rpc error: code = NotFound desc = could not find container \"45412f9b1cc503ca1d65287ceb8ce89e2ec70b786e4f3f800c94087f759517e0\": container with ID starting with 45412f9b1cc503ca1d65287ceb8ce89e2ec70b786e4f3f800c94087f759517e0 not found: ID does not exist" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.528938 4835 scope.go:117] "RemoveContainer" containerID="82cdd552bd8994820289189a6ad58b7da0d0849b9d8048609dea0a6514c0530d" Oct 03 18:36:13 crc kubenswrapper[4835]: E1003 18:36:13.536603 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82cdd552bd8994820289189a6ad58b7da0d0849b9d8048609dea0a6514c0530d\": container with ID starting with 82cdd552bd8994820289189a6ad58b7da0d0849b9d8048609dea0a6514c0530d not found: ID does not exist" containerID="82cdd552bd8994820289189a6ad58b7da0d0849b9d8048609dea0a6514c0530d" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.536644 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82cdd552bd8994820289189a6ad58b7da0d0849b9d8048609dea0a6514c0530d"} err="failed to get container status \"82cdd552bd8994820289189a6ad58b7da0d0849b9d8048609dea0a6514c0530d\": rpc error: code = NotFound desc = could not find container \"82cdd552bd8994820289189a6ad58b7da0d0849b9d8048609dea0a6514c0530d\": container with ID starting with 82cdd552bd8994820289189a6ad58b7da0d0849b9d8048609dea0a6514c0530d not found: ID does not exist" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.553654 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6fd26bdb-868b-49db-9698-e7c79eea5cef" (UID: "6fd26bdb-868b-49db-9698-e7c79eea5cef"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.554082 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6fd26bdb-868b-49db-9698-e7c79eea5cef-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.554102 4835 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6fd26bdb-868b-49db-9698-e7c79eea5cef-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.554126 4835 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.782861 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.809463 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.824174 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 18:36:13 crc kubenswrapper[4835]: E1003 18:36:13.824629 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd26bdb-868b-49db-9698-e7c79eea5cef" containerName="setup-container" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.824650 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd26bdb-868b-49db-9698-e7c79eea5cef" containerName="setup-container" Oct 03 18:36:13 crc kubenswrapper[4835]: E1003 18:36:13.824659 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd26bdb-868b-49db-9698-e7c79eea5cef" containerName="rabbitmq" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.824665 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd26bdb-868b-49db-9698-e7c79eea5cef" containerName="rabbitmq" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.824861 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fd26bdb-868b-49db-9698-e7c79eea5cef" containerName="rabbitmq" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.826015 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.831560 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.836233 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.836332 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.836425 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.836525 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.836634 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-4cvq6" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.836771 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.838891 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.966828 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38b03498-2a1a-4e93-993a-009b39463f69-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.966876 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38b03498-2a1a-4e93-993a-009b39463f69-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.967027 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38b03498-2a1a-4e93-993a-009b39463f69-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.967258 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38b03498-2a1a-4e93-993a-009b39463f69-config-data\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.967348 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38b03498-2a1a-4e93-993a-009b39463f69-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.967399 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtjhw\" (UniqueName: 
\"kubernetes.io/projected/38b03498-2a1a-4e93-993a-009b39463f69-kube-api-access-xtjhw\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.967619 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38b03498-2a1a-4e93-993a-009b39463f69-server-conf\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.967670 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38b03498-2a1a-4e93-993a-009b39463f69-pod-info\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.967855 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.967924 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38b03498-2a1a-4e93-993a-009b39463f69-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:13 crc kubenswrapper[4835]: I1003 18:36:13.967991 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38b03498-2a1a-4e93-993a-009b39463f69-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.069857 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38b03498-2a1a-4e93-993a-009b39463f69-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.069931 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38b03498-2a1a-4e93-993a-009b39463f69-config-data\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.069968 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38b03498-2a1a-4e93-993a-009b39463f69-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.069998 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtjhw\" (UniqueName: \"kubernetes.io/projected/38b03498-2a1a-4e93-993a-009b39463f69-kube-api-access-xtjhw\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " 
pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.070055 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38b03498-2a1a-4e93-993a-009b39463f69-server-conf\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.070092 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38b03498-2a1a-4e93-993a-009b39463f69-pod-info\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.070117 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.070143 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38b03498-2a1a-4e93-993a-009b39463f69-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.070169 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38b03498-2a1a-4e93-993a-009b39463f69-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.070210 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/38b03498-2a1a-4e93-993a-009b39463f69-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.070228 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38b03498-2a1a-4e93-993a-009b39463f69-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.070432 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.070991 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/38b03498-2a1a-4e93-993a-009b39463f69-config-data\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.071200 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/38b03498-2a1a-4e93-993a-009b39463f69-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.071360 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/38b03498-2a1a-4e93-993a-009b39463f69-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.071370 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/38b03498-2a1a-4e93-993a-009b39463f69-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.071994 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/38b03498-2a1a-4e93-993a-009b39463f69-server-conf\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.074977 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/38b03498-2a1a-4e93-993a-009b39463f69-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.081603 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/38b03498-2a1a-4e93-993a-009b39463f69-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.082504 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/38b03498-2a1a-4e93-993a-009b39463f69-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.083286 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/38b03498-2a1a-4e93-993a-009b39463f69-pod-info\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.091704 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtjhw\" (UniqueName: \"kubernetes.io/projected/38b03498-2a1a-4e93-993a-009b39463f69-kube-api-access-xtjhw\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.133306 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"38b03498-2a1a-4e93-993a-009b39463f69\") " pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.207903 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.319784 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.377936 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"2f5f99aa-dba6-465b-866a-1e293ba51685\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.378124 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-erlang-cookie\") pod \"2f5f99aa-dba6-465b-866a-1e293ba51685\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.378194 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2f5f99aa-dba6-465b-866a-1e293ba51685-pod-info\") pod \"2f5f99aa-dba6-465b-866a-1e293ba51685\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.378228 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-config-data\") pod \"2f5f99aa-dba6-465b-866a-1e293ba51685\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.378254 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-tls\") pod \"2f5f99aa-dba6-465b-866a-1e293ba51685\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.378288 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2f5f99aa-dba6-465b-866a-1e293ba51685-erlang-cookie-secret\") pod \"2f5f99aa-dba6-465b-866a-1e293ba51685\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.378307 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-server-conf\") pod \"2f5f99aa-dba6-465b-866a-1e293ba51685\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.378344 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-plugins-conf\") pod \"2f5f99aa-dba6-465b-866a-1e293ba51685\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.378362 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-plugins\") pod \"2f5f99aa-dba6-465b-866a-1e293ba51685\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.378384 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-tq47v\" (UniqueName: \"kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-kube-api-access-tq47v\") pod \"2f5f99aa-dba6-465b-866a-1e293ba51685\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.378413 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-confd\") pod \"2f5f99aa-dba6-465b-866a-1e293ba51685\" (UID: \"2f5f99aa-dba6-465b-866a-1e293ba51685\") " Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.380658 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2f5f99aa-dba6-465b-866a-1e293ba51685" (UID: "2f5f99aa-dba6-465b-866a-1e293ba51685"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.380737 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2f5f99aa-dba6-465b-866a-1e293ba51685" (UID: "2f5f99aa-dba6-465b-866a-1e293ba51685"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.384868 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2f5f99aa-dba6-465b-866a-1e293ba51685" (UID: "2f5f99aa-dba6-465b-866a-1e293ba51685"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.387343 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "2f5f99aa-dba6-465b-866a-1e293ba51685" (UID: "2f5f99aa-dba6-465b-866a-1e293ba51685"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.388640 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2f5f99aa-dba6-465b-866a-1e293ba51685" (UID: "2f5f99aa-dba6-465b-866a-1e293ba51685"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.391299 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2f5f99aa-dba6-465b-866a-1e293ba51685-pod-info" (OuterVolumeSpecName: "pod-info") pod "2f5f99aa-dba6-465b-866a-1e293ba51685" (UID: "2f5f99aa-dba6-465b-866a-1e293ba51685"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.393391 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-kube-api-access-tq47v" (OuterVolumeSpecName: "kube-api-access-tq47v") pod "2f5f99aa-dba6-465b-866a-1e293ba51685" (UID: "2f5f99aa-dba6-465b-866a-1e293ba51685"). InnerVolumeSpecName "kube-api-access-tq47v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.394139 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f5f99aa-dba6-465b-866a-1e293ba51685-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2f5f99aa-dba6-465b-866a-1e293ba51685" (UID: "2f5f99aa-dba6-465b-866a-1e293ba51685"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.423548 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-config-data" (OuterVolumeSpecName: "config-data") pod "2f5f99aa-dba6-465b-866a-1e293ba51685" (UID: "2f5f99aa-dba6-465b-866a-1e293ba51685"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.440668 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-server-conf" (OuterVolumeSpecName: "server-conf") pod "2f5f99aa-dba6-465b-866a-1e293ba51685" (UID: "2f5f99aa-dba6-465b-866a-1e293ba51685"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.455709 4835 generic.go:334] "Generic (PLEG): container finished" podID="2f5f99aa-dba6-465b-866a-1e293ba51685" containerID="daa7a9cd848675475ad1e8821bcf022f7f368dccbd8b463091be4977cebba766" exitCode=0 Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.455803 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2f5f99aa-dba6-465b-866a-1e293ba51685","Type":"ContainerDied","Data":"daa7a9cd848675475ad1e8821bcf022f7f368dccbd8b463091be4977cebba766"} Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.455830 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2f5f99aa-dba6-465b-866a-1e293ba51685","Type":"ContainerDied","Data":"4d745c208d96515f2f2b4d897bbf4fa83a87f8e897ea2c8b2133d9c30f4e826d"} Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.455846 4835 scope.go:117] "RemoveContainer" containerID="daa7a9cd848675475ad1e8821bcf022f7f368dccbd8b463091be4977cebba766" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.455990 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.475376 4835 scope.go:117] "RemoveContainer" containerID="6073b92c5fb7502902c8d7c4767ef2d79902b9a0b6bebdbfdacd3f52f2b31641" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.480871 4835 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2f5f99aa-dba6-465b-866a-1e293ba51685-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.480900 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.480908 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.480916 4835 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2f5f99aa-dba6-465b-866a-1e293ba51685-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.480926 4835 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.480934 4835 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2f5f99aa-dba6-465b-866a-1e293ba51685-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.480945 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.480954 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq47v\" (UniqueName: \"kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-kube-api-access-tq47v\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.480991 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.481003 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.495055 4835 scope.go:117] "RemoveContainer" containerID="daa7a9cd848675475ad1e8821bcf022f7f368dccbd8b463091be4977cebba766" Oct 03 18:36:14 crc kubenswrapper[4835]: E1003 18:36:14.495767 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daa7a9cd848675475ad1e8821bcf022f7f368dccbd8b463091be4977cebba766\": container with ID starting with daa7a9cd848675475ad1e8821bcf022f7f368dccbd8b463091be4977cebba766 not found: ID does not exist" 
containerID="daa7a9cd848675475ad1e8821bcf022f7f368dccbd8b463091be4977cebba766" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.495802 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa7a9cd848675475ad1e8821bcf022f7f368dccbd8b463091be4977cebba766"} err="failed to get container status \"daa7a9cd848675475ad1e8821bcf022f7f368dccbd8b463091be4977cebba766\": rpc error: code = NotFound desc = could not find container \"daa7a9cd848675475ad1e8821bcf022f7f368dccbd8b463091be4977cebba766\": container with ID starting with daa7a9cd848675475ad1e8821bcf022f7f368dccbd8b463091be4977cebba766 not found: ID does not exist" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.495829 4835 scope.go:117] "RemoveContainer" containerID="6073b92c5fb7502902c8d7c4767ef2d79902b9a0b6bebdbfdacd3f52f2b31641" Oct 03 18:36:14 crc kubenswrapper[4835]: E1003 18:36:14.496321 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6073b92c5fb7502902c8d7c4767ef2d79902b9a0b6bebdbfdacd3f52f2b31641\": container with ID starting with 6073b92c5fb7502902c8d7c4767ef2d79902b9a0b6bebdbfdacd3f52f2b31641 not found: ID does not exist" containerID="6073b92c5fb7502902c8d7c4767ef2d79902b9a0b6bebdbfdacd3f52f2b31641" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.496367 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6073b92c5fb7502902c8d7c4767ef2d79902b9a0b6bebdbfdacd3f52f2b31641"} err="failed to get container status \"6073b92c5fb7502902c8d7c4767ef2d79902b9a0b6bebdbfdacd3f52f2b31641\": rpc error: code = NotFound desc = could not find container \"6073b92c5fb7502902c8d7c4767ef2d79902b9a0b6bebdbfdacd3f52f2b31641\": container with ID starting with 6073b92c5fb7502902c8d7c4767ef2d79902b9a0b6bebdbfdacd3f52f2b31641 not found: ID does not exist" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.500996 4835 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.534814 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2f5f99aa-dba6-465b-866a-1e293ba51685" (UID: "2f5f99aa-dba6-465b-866a-1e293ba51685"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.583215 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2f5f99aa-dba6-465b-866a-1e293ba51685-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.583284 4835 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.658085 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.792995 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.802217 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.815178 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 18:36:14 crc kubenswrapper[4835]: E1003 18:36:14.815677 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f5f99aa-dba6-465b-866a-1e293ba51685" containerName="setup-container" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.815693 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f5f99aa-dba6-465b-866a-1e293ba51685" containerName="setup-container" Oct 03 18:36:14 crc kubenswrapper[4835]: E1003 18:36:14.815721 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f5f99aa-dba6-465b-866a-1e293ba51685" containerName="rabbitmq" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.815730 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f5f99aa-dba6-465b-866a-1e293ba51685" containerName="rabbitmq" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.816031 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f5f99aa-dba6-465b-866a-1e293ba51685" containerName="rabbitmq" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.819293 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.822257 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.822330 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.822339 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-gltwr" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.822487 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.822851 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.822859 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.824555 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.826421 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.893138 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53873352-044d-4511-b474-6da275dc856e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.893371 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53873352-044d-4511-b474-6da275dc856e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.893481 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53873352-044d-4511-b474-6da275dc856e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.893560 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7bh4\" (UniqueName: \"kubernetes.io/projected/53873352-044d-4511-b474-6da275dc856e-kube-api-access-m7bh4\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.893633 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53873352-044d-4511-b474-6da275dc856e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.893747 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53873352-044d-4511-b474-6da275dc856e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.893818 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53873352-044d-4511-b474-6da275dc856e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.893941 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.894034 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53873352-044d-4511-b474-6da275dc856e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.894297 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53873352-044d-4511-b474-6da275dc856e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.894385 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53873352-044d-4511-b474-6da275dc856e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.894659 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f5f99aa-dba6-465b-866a-1e293ba51685" path="/var/lib/kubelet/pods/2f5f99aa-dba6-465b-866a-1e293ba51685/volumes" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.895438 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fd26bdb-868b-49db-9698-e7c79eea5cef" path="/var/lib/kubelet/pods/6fd26bdb-868b-49db-9698-e7c79eea5cef/volumes" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.995956 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.996046 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53873352-044d-4511-b474-6da275dc856e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc 
kubenswrapper[4835]: I1003 18:36:14.996112 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53873352-044d-4511-b474-6da275dc856e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.996158 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53873352-044d-4511-b474-6da275dc856e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.996215 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53873352-044d-4511-b474-6da275dc856e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.996276 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53873352-044d-4511-b474-6da275dc856e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.996304 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53873352-044d-4511-b474-6da275dc856e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.996364 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7bh4\" (UniqueName: \"kubernetes.io/projected/53873352-044d-4511-b474-6da275dc856e-kube-api-access-m7bh4\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.996399 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53873352-044d-4511-b474-6da275dc856e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.996449 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53873352-044d-4511-b474-6da275dc856e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.996476 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53873352-044d-4511-b474-6da275dc856e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.997094 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53873352-044d-4511-b474-6da275dc856e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:14 crc kubenswrapper[4835]: I1003 18:36:14.998250 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:15 crc kubenswrapper[4835]: I1003 18:36:15.003166 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53873352-044d-4511-b474-6da275dc856e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:15 crc kubenswrapper[4835]: I1003 18:36:15.003466 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53873352-044d-4511-b474-6da275dc856e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:15 crc kubenswrapper[4835]: I1003 18:36:15.004321 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53873352-044d-4511-b474-6da275dc856e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:15 crc kubenswrapper[4835]: I1003 18:36:15.005324 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53873352-044d-4511-b474-6da275dc856e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:15 crc kubenswrapper[4835]: I1003 18:36:15.005381 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53873352-044d-4511-b474-6da275dc856e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:15 crc kubenswrapper[4835]: I1003 18:36:15.005438 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53873352-044d-4511-b474-6da275dc856e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:15 crc kubenswrapper[4835]: I1003 18:36:15.005552 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53873352-044d-4511-b474-6da275dc856e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:15 crc kubenswrapper[4835]: I1003 18:36:15.006621 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53873352-044d-4511-b474-6da275dc856e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") 
" pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:15 crc kubenswrapper[4835]: I1003 18:36:15.021090 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7bh4\" (UniqueName: \"kubernetes.io/projected/53873352-044d-4511-b474-6da275dc856e-kube-api-access-m7bh4\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:15 crc kubenswrapper[4835]: I1003 18:36:15.040995 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"53873352-044d-4511-b474-6da275dc856e\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:15 crc kubenswrapper[4835]: I1003 18:36:15.138927 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:15 crc kubenswrapper[4835]: I1003 18:36:15.468201 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38b03498-2a1a-4e93-993a-009b39463f69","Type":"ContainerStarted","Data":"f1b6af3320a556b887a873ee9ceb817667eb7c752f347b79b9975e3da0704754"} Oct 03 18:36:15 crc kubenswrapper[4835]: I1003 18:36:15.619437 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 18:36:16 crc kubenswrapper[4835]: I1003 18:36:16.480633 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"53873352-044d-4511-b474-6da275dc856e","Type":"ContainerStarted","Data":"51e3db0ea9b8d909687813e541b24865ad36934990c948a1e1969efef1b34a0c"} Oct 03 18:36:16 crc kubenswrapper[4835]: I1003 18:36:16.483223 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38b03498-2a1a-4e93-993a-009b39463f69","Type":"ContainerStarted","Data":"bbb713d871aa4e1ab6670a6f27efde7dffcbed7aa0ca276232ee91cc610f0d08"} Oct 03 18:36:17 crc kubenswrapper[4835]: I1003 18:36:17.497639 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"53873352-044d-4511-b474-6da275dc856e","Type":"ContainerStarted","Data":"6b5c49400fa418a1c94285b87d70f7dfb988346b6c2a0e9013d8d8a3ea6a5c99"} Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.780628 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86d4475dc7-45rrv"] Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.782950 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.785634 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.794232 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d4475dc7-45rrv"] Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.860064 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-dns-swift-storage-0\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.860131 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-ovsdbserver-sb\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.860180 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-config\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.860215 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-openstack-edpm-ipam\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.860248 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-ovsdbserver-nb\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.860290 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snmg4\" (UniqueName: \"kubernetes.io/projected/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-kube-api-access-snmg4\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.860435 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-dns-svc\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.962024 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-config\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: 
\"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.962107 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-openstack-edpm-ipam\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.962147 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-ovsdbserver-nb\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.962209 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snmg4\" (UniqueName: \"kubernetes.io/projected/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-kube-api-access-snmg4\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.962259 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-dns-svc\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.962335 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-dns-swift-storage-0\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.962385 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-ovsdbserver-sb\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.962858 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-config\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.963273 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-ovsdbserver-nb\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.963364 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-dns-svc\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " 
pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.963498 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-ovsdbserver-sb\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.963554 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-dns-swift-storage-0\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.963931 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-openstack-edpm-ipam\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:23 crc kubenswrapper[4835]: I1003 18:36:23.983949 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snmg4\" (UniqueName: \"kubernetes.io/projected/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-kube-api-access-snmg4\") pod \"dnsmasq-dns-86d4475dc7-45rrv\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:24 crc kubenswrapper[4835]: I1003 18:36:24.102530 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:24 crc kubenswrapper[4835]: W1003 18:36:24.589862 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71a94ea8_dadc_4f3f_bd9d_b07a26adbb83.slice/crio-f5823bea8e6abdf7c45a5f9fe186b61312fed429693aa18a53fb380547e6ea0c WatchSource:0}: Error finding container f5823bea8e6abdf7c45a5f9fe186b61312fed429693aa18a53fb380547e6ea0c: Status 404 returned error can't find the container with id f5823bea8e6abdf7c45a5f9fe186b61312fed429693aa18a53fb380547e6ea0c Oct 03 18:36:24 crc kubenswrapper[4835]: I1003 18:36:24.592937 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d4475dc7-45rrv"] Oct 03 18:36:25 crc kubenswrapper[4835]: I1003 18:36:25.574636 4835 generic.go:334] "Generic (PLEG): container finished" podID="71a94ea8-dadc-4f3f-bd9d-b07a26adbb83" containerID="97712052dd964589abfd5ccf0246be143c88d65395f811cc01600e01b38a821d" exitCode=0 Oct 03 18:36:25 crc kubenswrapper[4835]: I1003 18:36:25.574690 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" event={"ID":"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83","Type":"ContainerDied","Data":"97712052dd964589abfd5ccf0246be143c88d65395f811cc01600e01b38a821d"} Oct 03 18:36:25 crc kubenswrapper[4835]: I1003 18:36:25.575024 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" event={"ID":"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83","Type":"ContainerStarted","Data":"f5823bea8e6abdf7c45a5f9fe186b61312fed429693aa18a53fb380547e6ea0c"} Oct 03 18:36:26 crc kubenswrapper[4835]: I1003 18:36:26.585204 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" 
event={"ID":"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83","Type":"ContainerStarted","Data":"5453e9aaaadabaaf82c27c908c8674deeea07207e18a4a9f23e79b5c714cb024"} Oct 03 18:36:26 crc kubenswrapper[4835]: I1003 18:36:26.585514 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:26 crc kubenswrapper[4835]: I1003 18:36:26.613840 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" podStartSLOduration=3.613820233 podStartE2EDuration="3.613820233s" podCreationTimestamp="2025-10-03 18:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:36:26.604057978 +0000 UTC m=+1328.319998840" watchObservedRunningTime="2025-10-03 18:36:26.613820233 +0000 UTC m=+1328.329761115" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.104347 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.154846 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c84bdb669-6frp9"] Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.155324 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" podUID="7729606d-e769-4a52-9ebc-830f6a79d3ff" containerName="dnsmasq-dns" containerID="cri-o://d3f58fdab93f62baa6af045ff1a0c57a51e2e37dc61cdde574d4ff5feb2a0b15" gracePeriod=10 Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.323128 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f444fb569-rpskt"] Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.324948 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.346403 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f444fb569-rpskt"] Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.477386 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-dns-swift-storage-0\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.477621 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-ovsdbserver-nb\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.477695 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-ovsdbserver-sb\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.477879 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-dns-svc\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.477910 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-config\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.477934 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bb7v\" (UniqueName: \"kubernetes.io/projected/c89c5f86-48a0-4bc7-9052-806376312506-kube-api-access-4bb7v\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.477999 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-openstack-edpm-ipam\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.579487 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-ovsdbserver-sb\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.579600 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-dns-svc\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.579621 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-config\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.579640 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bb7v\" (UniqueName: \"kubernetes.io/projected/c89c5f86-48a0-4bc7-9052-806376312506-kube-api-access-4bb7v\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.579665 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-openstack-edpm-ipam\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.579724 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-dns-swift-storage-0\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.579738 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-ovsdbserver-nb\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.580575 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-ovsdbserver-nb\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.581441 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-ovsdbserver-sb\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.581942 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-dns-svc\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.582483 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-config\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.583764 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-openstack-edpm-ipam\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.584672 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c89c5f86-48a0-4bc7-9052-806376312506-dns-swift-storage-0\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.609141 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bb7v\" (UniqueName: \"kubernetes.io/projected/c89c5f86-48a0-4bc7-9052-806376312506-kube-api-access-4bb7v\") pod \"dnsmasq-dns-5f444fb569-rpskt\" (UID: \"c89c5f86-48a0-4bc7-9052-806376312506\") " pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.646236 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.687602 4835 generic.go:334] "Generic (PLEG): container finished" podID="7729606d-e769-4a52-9ebc-830f6a79d3ff" containerID="d3f58fdab93f62baa6af045ff1a0c57a51e2e37dc61cdde574d4ff5feb2a0b15" exitCode=0 Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.687644 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" event={"ID":"7729606d-e769-4a52-9ebc-830f6a79d3ff","Type":"ContainerDied","Data":"d3f58fdab93f62baa6af045ff1a0c57a51e2e37dc61cdde574d4ff5feb2a0b15"} Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.687668 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" event={"ID":"7729606d-e769-4a52-9ebc-830f6a79d3ff","Type":"ContainerDied","Data":"0c924cfd5e69dd92b6c09b4bd9541dd1b5aa23d17d9f2eeddd23e09e387832c8"} Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.687708 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c924cfd5e69dd92b6c09b4bd9541dd1b5aa23d17d9f2eeddd23e09e387832c8" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.749838 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.887390 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-dns-svc\") pod \"7729606d-e769-4a52-9ebc-830f6a79d3ff\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.887586 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-dns-swift-storage-0\") pod \"7729606d-e769-4a52-9ebc-830f6a79d3ff\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.887621 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-ovsdbserver-nb\") pod \"7729606d-e769-4a52-9ebc-830f6a79d3ff\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.887670 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-ovsdbserver-sb\") pod \"7729606d-e769-4a52-9ebc-830f6a79d3ff\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.887704 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-config\") pod \"7729606d-e769-4a52-9ebc-830f6a79d3ff\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.887761 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg88n\" (UniqueName: \"kubernetes.io/projected/7729606d-e769-4a52-9ebc-830f6a79d3ff-kube-api-access-lg88n\") pod \"7729606d-e769-4a52-9ebc-830f6a79d3ff\" (UID: \"7729606d-e769-4a52-9ebc-830f6a79d3ff\") " Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.892789 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7729606d-e769-4a52-9ebc-830f6a79d3ff-kube-api-access-lg88n" (OuterVolumeSpecName: "kube-api-access-lg88n") pod "7729606d-e769-4a52-9ebc-830f6a79d3ff" (UID: "7729606d-e769-4a52-9ebc-830f6a79d3ff"). InnerVolumeSpecName "kube-api-access-lg88n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.975487 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7729606d-e769-4a52-9ebc-830f6a79d3ff" (UID: "7729606d-e769-4a52-9ebc-830f6a79d3ff"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.981154 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7729606d-e769-4a52-9ebc-830f6a79d3ff" (UID: "7729606d-e769-4a52-9ebc-830f6a79d3ff"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.992657 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.992691 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:34 crc kubenswrapper[4835]: I1003 18:36:34.992700 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg88n\" (UniqueName: \"kubernetes.io/projected/7729606d-e769-4a52-9ebc-830f6a79d3ff-kube-api-access-lg88n\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:35 crc kubenswrapper[4835]: I1003 18:36:35.000057 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7729606d-e769-4a52-9ebc-830f6a79d3ff" (UID: "7729606d-e769-4a52-9ebc-830f6a79d3ff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:35 crc kubenswrapper[4835]: I1003 18:36:35.005503 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7729606d-e769-4a52-9ebc-830f6a79d3ff" (UID: "7729606d-e769-4a52-9ebc-830f6a79d3ff"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:35 crc kubenswrapper[4835]: I1003 18:36:35.014575 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-config" (OuterVolumeSpecName: "config") pod "7729606d-e769-4a52-9ebc-830f6a79d3ff" (UID: "7729606d-e769-4a52-9ebc-830f6a79d3ff"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:35 crc kubenswrapper[4835]: W1003 18:36:35.086528 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc89c5f86_48a0_4bc7_9052_806376312506.slice/crio-4691a59705b588a3cfa57d6974d99ccc39866ebbf1e1f10a5e2e36ab3501c17e WatchSource:0}: Error finding container 4691a59705b588a3cfa57d6974d99ccc39866ebbf1e1f10a5e2e36ab3501c17e: Status 404 returned error can't find the container with id 4691a59705b588a3cfa57d6974d99ccc39866ebbf1e1f10a5e2e36ab3501c17e Oct 03 18:36:35 crc kubenswrapper[4835]: I1003 18:36:35.086878 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f444fb569-rpskt"] Oct 03 18:36:35 crc kubenswrapper[4835]: I1003 18:36:35.094269 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:35 crc kubenswrapper[4835]: I1003 18:36:35.094310 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:35 crc kubenswrapper[4835]: I1003 18:36:35.094323 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7729606d-e769-4a52-9ebc-830f6a79d3ff-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:35 crc kubenswrapper[4835]: I1003 18:36:35.358250 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:36:35 crc kubenswrapper[4835]: I1003 18:36:35.358299 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:36:35 crc kubenswrapper[4835]: I1003 18:36:35.700199 4835 generic.go:334] "Generic (PLEG): container finished" podID="c89c5f86-48a0-4bc7-9052-806376312506" containerID="5a8f10216abd2d8034835778797fb2c4df8ee1501ded7ff81a94fac20085a7f0" exitCode=0 Oct 03 18:36:35 crc kubenswrapper[4835]: I1003 18:36:35.700466 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" Oct 03 18:36:35 crc kubenswrapper[4835]: I1003 18:36:35.702143 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f444fb569-rpskt" event={"ID":"c89c5f86-48a0-4bc7-9052-806376312506","Type":"ContainerDied","Data":"5a8f10216abd2d8034835778797fb2c4df8ee1501ded7ff81a94fac20085a7f0"} Oct 03 18:36:35 crc kubenswrapper[4835]: I1003 18:36:35.702206 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f444fb569-rpskt" event={"ID":"c89c5f86-48a0-4bc7-9052-806376312506","Type":"ContainerStarted","Data":"4691a59705b588a3cfa57d6974d99ccc39866ebbf1e1f10a5e2e36ab3501c17e"} Oct 03 18:36:35 crc kubenswrapper[4835]: I1003 18:36:35.957261 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c84bdb669-6frp9"] Oct 03 18:36:35 crc kubenswrapper[4835]: I1003 18:36:35.970014 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c84bdb669-6frp9"] Oct 03 18:36:36 crc kubenswrapper[4835]: I1003 18:36:36.710988 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f444fb569-rpskt" event={"ID":"c89c5f86-48a0-4bc7-9052-806376312506","Type":"ContainerStarted","Data":"7bf833517987f8a9f55915f6155648187e19c84bc004b3a72cd09fde10c57905"} Oct 03 18:36:36 crc kubenswrapper[4835]: I1003 18:36:36.711106 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:36 crc kubenswrapper[4835]: I1003 18:36:36.732560 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f444fb569-rpskt" podStartSLOduration=2.732537404 podStartE2EDuration="2.732537404s" podCreationTimestamp="2025-10-03 18:36:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:36:36.72833373 +0000 UTC m=+1338.444274602" watchObservedRunningTime="2025-10-03 18:36:36.732537404 +0000 UTC m=+1338.448478276" Oct 03 18:36:36 crc kubenswrapper[4835]: I1003 18:36:36.886817 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7729606d-e769-4a52-9ebc-830f6a79d3ff" path="/var/lib/kubelet/pods/7729606d-e769-4a52-9ebc-830f6a79d3ff/volumes" Oct 03 18:36:39 crc kubenswrapper[4835]: I1003 18:36:39.412749 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6c84bdb669-6frp9" podUID="7729606d-e769-4a52-9ebc-830f6a79d3ff" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.221:5353: i/o timeout" Oct 03 18:36:44 crc kubenswrapper[4835]: I1003 18:36:44.648308 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f444fb569-rpskt" Oct 03 18:36:44 crc kubenswrapper[4835]: I1003 18:36:44.710642 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d4475dc7-45rrv"] Oct 03 18:36:44 crc kubenswrapper[4835]: I1003 18:36:44.710914 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" podUID="71a94ea8-dadc-4f3f-bd9d-b07a26adbb83" containerName="dnsmasq-dns" containerID="cri-o://5453e9aaaadabaaf82c27c908c8674deeea07207e18a4a9f23e79b5c714cb024" gracePeriod=10 Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.395527 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.413892 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-ovsdbserver-sb\") pod \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.414057 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snmg4\" (UniqueName: \"kubernetes.io/projected/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-kube-api-access-snmg4\") pod \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.414119 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-dns-swift-storage-0\") pod \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.414237 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-dns-svc\") pod \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.414324 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-config\") pod \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.414352 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-openstack-edpm-ipam\") pod \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.414369 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-ovsdbserver-nb\") pod \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\" (UID: \"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83\") " Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.420581 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-kube-api-access-snmg4" (OuterVolumeSpecName: "kube-api-access-snmg4") pod "71a94ea8-dadc-4f3f-bd9d-b07a26adbb83" (UID: "71a94ea8-dadc-4f3f-bd9d-b07a26adbb83"). InnerVolumeSpecName "kube-api-access-snmg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.476739 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "71a94ea8-dadc-4f3f-bd9d-b07a26adbb83" (UID: "71a94ea8-dadc-4f3f-bd9d-b07a26adbb83"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.490529 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "71a94ea8-dadc-4f3f-bd9d-b07a26adbb83" (UID: "71a94ea8-dadc-4f3f-bd9d-b07a26adbb83"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.503388 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-config" (OuterVolumeSpecName: "config") pod "71a94ea8-dadc-4f3f-bd9d-b07a26adbb83" (UID: "71a94ea8-dadc-4f3f-bd9d-b07a26adbb83"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.516528 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-config\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.516715 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.516802 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snmg4\" (UniqueName: \"kubernetes.io/projected/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-kube-api-access-snmg4\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.516862 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.518272 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "71a94ea8-dadc-4f3f-bd9d-b07a26adbb83" (UID: "71a94ea8-dadc-4f3f-bd9d-b07a26adbb83"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.527823 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "71a94ea8-dadc-4f3f-bd9d-b07a26adbb83" (UID: "71a94ea8-dadc-4f3f-bd9d-b07a26adbb83"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.529521 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "71a94ea8-dadc-4f3f-bd9d-b07a26adbb83" (UID: "71a94ea8-dadc-4f3f-bd9d-b07a26adbb83"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.618977 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.619017 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.619030 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.787247 4835 generic.go:334] "Generic (PLEG): container finished" podID="71a94ea8-dadc-4f3f-bd9d-b07a26adbb83" containerID="5453e9aaaadabaaf82c27c908c8674deeea07207e18a4a9f23e79b5c714cb024" exitCode=0 Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.787300 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" event={"ID":"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83","Type":"ContainerDied","Data":"5453e9aaaadabaaf82c27c908c8674deeea07207e18a4a9f23e79b5c714cb024"} Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.787332 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" event={"ID":"71a94ea8-dadc-4f3f-bd9d-b07a26adbb83","Type":"ContainerDied","Data":"f5823bea8e6abdf7c45a5f9fe186b61312fed429693aa18a53fb380547e6ea0c"} Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.787352 4835 scope.go:117] "RemoveContainer" containerID="5453e9aaaadabaaf82c27c908c8674deeea07207e18a4a9f23e79b5c714cb024" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.787513 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d4475dc7-45rrv" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.819183 4835 scope.go:117] "RemoveContainer" containerID="97712052dd964589abfd5ccf0246be143c88d65395f811cc01600e01b38a821d" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.831235 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d4475dc7-45rrv"] Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.841330 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86d4475dc7-45rrv"] Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.848584 4835 scope.go:117] "RemoveContainer" containerID="5453e9aaaadabaaf82c27c908c8674deeea07207e18a4a9f23e79b5c714cb024" Oct 03 18:36:45 crc kubenswrapper[4835]: E1003 18:36:45.849104 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5453e9aaaadabaaf82c27c908c8674deeea07207e18a4a9f23e79b5c714cb024\": container with ID starting with 5453e9aaaadabaaf82c27c908c8674deeea07207e18a4a9f23e79b5c714cb024 not found: ID does not exist" containerID="5453e9aaaadabaaf82c27c908c8674deeea07207e18a4a9f23e79b5c714cb024" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.849150 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5453e9aaaadabaaf82c27c908c8674deeea07207e18a4a9f23e79b5c714cb024"} err="failed to get container status \"5453e9aaaadabaaf82c27c908c8674deeea07207e18a4a9f23e79b5c714cb024\": rpc error: code = NotFound desc = could not find container \"5453e9aaaadabaaf82c27c908c8674deeea07207e18a4a9f23e79b5c714cb024\": container with ID starting with 5453e9aaaadabaaf82c27c908c8674deeea07207e18a4a9f23e79b5c714cb024 not found: ID does not exist" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.849181 4835 scope.go:117] "RemoveContainer" containerID="97712052dd964589abfd5ccf0246be143c88d65395f811cc01600e01b38a821d" Oct 03 18:36:45 crc kubenswrapper[4835]: E1003 18:36:45.849630 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97712052dd964589abfd5ccf0246be143c88d65395f811cc01600e01b38a821d\": container with ID starting with 97712052dd964589abfd5ccf0246be143c88d65395f811cc01600e01b38a821d not found: ID does not exist" containerID="97712052dd964589abfd5ccf0246be143c88d65395f811cc01600e01b38a821d" Oct 03 18:36:45 crc kubenswrapper[4835]: I1003 18:36:45.849691 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97712052dd964589abfd5ccf0246be143c88d65395f811cc01600e01b38a821d"} err="failed to get container status \"97712052dd964589abfd5ccf0246be143c88d65395f811cc01600e01b38a821d\": rpc error: code = NotFound desc = could not find container \"97712052dd964589abfd5ccf0246be143c88d65395f811cc01600e01b38a821d\": container with ID starting with 97712052dd964589abfd5ccf0246be143c88d65395f811cc01600e01b38a821d not found: ID does not exist" Oct 03 18:36:46 crc kubenswrapper[4835]: I1003 18:36:46.900011 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a94ea8-dadc-4f3f-bd9d-b07a26adbb83" path="/var/lib/kubelet/pods/71a94ea8-dadc-4f3f-bd9d-b07a26adbb83/volumes" Oct 03 18:36:48 crc kubenswrapper[4835]: I1003 18:36:48.832136 4835 generic.go:334] "Generic (PLEG): container finished" podID="38b03498-2a1a-4e93-993a-009b39463f69" containerID="bbb713d871aa4e1ab6670a6f27efde7dffcbed7aa0ca276232ee91cc610f0d08" 
exitCode=0 Oct 03 18:36:48 crc kubenswrapper[4835]: I1003 18:36:48.832214 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38b03498-2a1a-4e93-993a-009b39463f69","Type":"ContainerDied","Data":"bbb713d871aa4e1ab6670a6f27efde7dffcbed7aa0ca276232ee91cc610f0d08"} Oct 03 18:36:49 crc kubenswrapper[4835]: I1003 18:36:49.846428 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"38b03498-2a1a-4e93-993a-009b39463f69","Type":"ContainerStarted","Data":"b3ee1b0e5f0bb24022cb58cbf88bbb470a6f7bb22368e9d40ef0a2c563339b20"} Oct 03 18:36:49 crc kubenswrapper[4835]: I1003 18:36:49.847022 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 03 18:36:49 crc kubenswrapper[4835]: I1003 18:36:49.848887 4835 generic.go:334] "Generic (PLEG): container finished" podID="53873352-044d-4511-b474-6da275dc856e" containerID="6b5c49400fa418a1c94285b87d70f7dfb988346b6c2a0e9013d8d8a3ea6a5c99" exitCode=0 Oct 03 18:36:49 crc kubenswrapper[4835]: I1003 18:36:49.848946 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"53873352-044d-4511-b474-6da275dc856e","Type":"ContainerDied","Data":"6b5c49400fa418a1c94285b87d70f7dfb988346b6c2a0e9013d8d8a3ea6a5c99"} Oct 03 18:36:49 crc kubenswrapper[4835]: I1003 18:36:49.882231 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.882207884 podStartE2EDuration="36.882207884s" podCreationTimestamp="2025-10-03 18:36:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:36:49.866840293 +0000 UTC m=+1351.582781185" watchObservedRunningTime="2025-10-03 18:36:49.882207884 +0000 UTC m=+1351.598148756" Oct 03 18:36:50 crc kubenswrapper[4835]: I1003 18:36:50.866595 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"53873352-044d-4511-b474-6da275dc856e","Type":"ContainerStarted","Data":"84863ba820e5d1a8bd598f458beeb813536d0d5d4ab64ea5411d4692d102d704"} Oct 03 18:36:50 crc kubenswrapper[4835]: I1003 18:36:50.901085 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.901044245 podStartE2EDuration="36.901044245s" podCreationTimestamp="2025-10-03 18:36:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 18:36:50.891638482 +0000 UTC m=+1352.607579354" watchObservedRunningTime="2025-10-03 18:36:50.901044245 +0000 UTC m=+1352.616985127" Oct 03 18:36:55 crc kubenswrapper[4835]: I1003 18:36:55.139647 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.782153 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mc5tw"] Oct 03 18:36:58 crc kubenswrapper[4835]: E1003 18:36:58.783314 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a94ea8-dadc-4f3f-bd9d-b07a26adbb83" containerName="dnsmasq-dns" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.783330 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a94ea8-dadc-4f3f-bd9d-b07a26adbb83" containerName="dnsmasq-dns" Oct 03 18:36:58 crc kubenswrapper[4835]: 
E1003 18:36:58.783352 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7729606d-e769-4a52-9ebc-830f6a79d3ff" containerName="init" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.783359 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7729606d-e769-4a52-9ebc-830f6a79d3ff" containerName="init" Oct 03 18:36:58 crc kubenswrapper[4835]: E1003 18:36:58.783393 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a94ea8-dadc-4f3f-bd9d-b07a26adbb83" containerName="init" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.783400 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a94ea8-dadc-4f3f-bd9d-b07a26adbb83" containerName="init" Oct 03 18:36:58 crc kubenswrapper[4835]: E1003 18:36:58.783413 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7729606d-e769-4a52-9ebc-830f6a79d3ff" containerName="dnsmasq-dns" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.783418 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7729606d-e769-4a52-9ebc-830f6a79d3ff" containerName="dnsmasq-dns" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.783609 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7729606d-e769-4a52-9ebc-830f6a79d3ff" containerName="dnsmasq-dns" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.783641 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a94ea8-dadc-4f3f-bd9d-b07a26adbb83" containerName="dnsmasq-dns" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.785031 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.811351 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mc5tw"] Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.859853 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85fk9\" (UniqueName: \"kubernetes.io/projected/129015b0-9c98-40ff-ae05-049a58dacb3c-kube-api-access-85fk9\") pod \"redhat-marketplace-mc5tw\" (UID: \"129015b0-9c98-40ff-ae05-049a58dacb3c\") " pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.860158 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/129015b0-9c98-40ff-ae05-049a58dacb3c-catalog-content\") pod \"redhat-marketplace-mc5tw\" (UID: \"129015b0-9c98-40ff-ae05-049a58dacb3c\") " pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.860364 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/129015b0-9c98-40ff-ae05-049a58dacb3c-utilities\") pod \"redhat-marketplace-mc5tw\" (UID: \"129015b0-9c98-40ff-ae05-049a58dacb3c\") " pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.961675 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/129015b0-9c98-40ff-ae05-049a58dacb3c-catalog-content\") pod \"redhat-marketplace-mc5tw\" (UID: \"129015b0-9c98-40ff-ae05-049a58dacb3c\") " pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 
18:36:58.961775 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/129015b0-9c98-40ff-ae05-049a58dacb3c-utilities\") pod \"redhat-marketplace-mc5tw\" (UID: \"129015b0-9c98-40ff-ae05-049a58dacb3c\") " pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.961885 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85fk9\" (UniqueName: \"kubernetes.io/projected/129015b0-9c98-40ff-ae05-049a58dacb3c-kube-api-access-85fk9\") pod \"redhat-marketplace-mc5tw\" (UID: \"129015b0-9c98-40ff-ae05-049a58dacb3c\") " pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.962435 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/129015b0-9c98-40ff-ae05-049a58dacb3c-utilities\") pod \"redhat-marketplace-mc5tw\" (UID: \"129015b0-9c98-40ff-ae05-049a58dacb3c\") " pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.962572 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/129015b0-9c98-40ff-ae05-049a58dacb3c-catalog-content\") pod \"redhat-marketplace-mc5tw\" (UID: \"129015b0-9c98-40ff-ae05-049a58dacb3c\") " pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:36:58 crc kubenswrapper[4835]: I1003 18:36:58.983670 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85fk9\" (UniqueName: \"kubernetes.io/projected/129015b0-9c98-40ff-ae05-049a58dacb3c-kube-api-access-85fk9\") pod \"redhat-marketplace-mc5tw\" (UID: \"129015b0-9c98-40ff-ae05-049a58dacb3c\") " pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:36:59 crc kubenswrapper[4835]: I1003 18:36:59.104446 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:36:59 crc kubenswrapper[4835]: I1003 18:36:59.614378 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mc5tw"] Oct 03 18:36:59 crc kubenswrapper[4835]: I1003 18:36:59.976784 4835 generic.go:334] "Generic (PLEG): container finished" podID="129015b0-9c98-40ff-ae05-049a58dacb3c" containerID="8a5a363eb9eb5710e77c44988c3f8a60479345c5dd2c5629ff4bd750cf79a213" exitCode=0 Oct 03 18:36:59 crc kubenswrapper[4835]: I1003 18:36:59.977152 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc5tw" event={"ID":"129015b0-9c98-40ff-ae05-049a58dacb3c","Type":"ContainerDied","Data":"8a5a363eb9eb5710e77c44988c3f8a60479345c5dd2c5629ff4bd750cf79a213"} Oct 03 18:36:59 crc kubenswrapper[4835]: I1003 18:36:59.977184 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc5tw" event={"ID":"129015b0-9c98-40ff-ae05-049a58dacb3c","Type":"ContainerStarted","Data":"853f034785d4cf625647f7ad753fa2a3ccfc69890c836c2a1ea9a9555ccda8b2"} Oct 03 18:37:00 crc kubenswrapper[4835]: I1003 18:37:00.991414 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc5tw" event={"ID":"129015b0-9c98-40ff-ae05-049a58dacb3c","Type":"ContainerStarted","Data":"099a3c483af5d50cfc1246b4b24454e52cfe98fb127107e2053d883269b8d135"} Oct 03 18:37:02 crc kubenswrapper[4835]: I1003 18:37:02.009866 4835 generic.go:334] "Generic (PLEG): container finished" podID="129015b0-9c98-40ff-ae05-049a58dacb3c" containerID="099a3c483af5d50cfc1246b4b24454e52cfe98fb127107e2053d883269b8d135" exitCode=0 Oct 03 18:37:02 crc kubenswrapper[4835]: I1003 18:37:02.010056 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc5tw" event={"ID":"129015b0-9c98-40ff-ae05-049a58dacb3c","Type":"ContainerDied","Data":"099a3c483af5d50cfc1246b4b24454e52cfe98fb127107e2053d883269b8d135"} Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.021015 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc5tw" event={"ID":"129015b0-9c98-40ff-ae05-049a58dacb3c","Type":"ContainerStarted","Data":"2a9274b10f664e7708c8e2aea97b6a1394bbad58fcdee64db3faf69618498d65"} Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.041452 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mc5tw" podStartSLOduration=2.609456729 podStartE2EDuration="5.04143445s" podCreationTimestamp="2025-10-03 18:36:58 +0000 UTC" firstStartedPulling="2025-10-03 18:36:59.978849992 +0000 UTC m=+1361.694790864" lastFinishedPulling="2025-10-03 18:37:02.410827713 +0000 UTC m=+1364.126768585" observedRunningTime="2025-10-03 18:37:03.039371248 +0000 UTC m=+1364.755312130" watchObservedRunningTime="2025-10-03 18:37:03.04143445 +0000 UTC m=+1364.757375322" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.317800 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48"] Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.319116 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.323993 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.324032 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.324032 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.324193 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.328387 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48"] Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.457193 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f2b6\" (UniqueName: \"kubernetes.io/projected/1571f66f-2633-4835-ba3f-db5f52eefb9b-kube-api-access-6f2b6\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48\" (UID: \"1571f66f-2633-4835-ba3f-db5f52eefb9b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.457283 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48\" (UID: \"1571f66f-2633-4835-ba3f-db5f52eefb9b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.457301 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48\" (UID: \"1571f66f-2633-4835-ba3f-db5f52eefb9b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.457429 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48\" (UID: \"1571f66f-2633-4835-ba3f-db5f52eefb9b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.559234 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f2b6\" (UniqueName: \"kubernetes.io/projected/1571f66f-2633-4835-ba3f-db5f52eefb9b-kube-api-access-6f2b6\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48\" (UID: \"1571f66f-2633-4835-ba3f-db5f52eefb9b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.559362 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48\" (UID: \"1571f66f-2633-4835-ba3f-db5f52eefb9b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.559390 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48\" (UID: \"1571f66f-2633-4835-ba3f-db5f52eefb9b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.559426 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48\" (UID: \"1571f66f-2633-4835-ba3f-db5f52eefb9b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.565781 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48\" (UID: \"1571f66f-2633-4835-ba3f-db5f52eefb9b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.572738 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48\" (UID: \"1571f66f-2633-4835-ba3f-db5f52eefb9b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.574181 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48\" (UID: \"1571f66f-2633-4835-ba3f-db5f52eefb9b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.575793 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f2b6\" (UniqueName: \"kubernetes.io/projected/1571f66f-2633-4835-ba3f-db5f52eefb9b-kube-api-access-6f2b6\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48\" (UID: \"1571f66f-2633-4835-ba3f-db5f52eefb9b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" Oct 03 18:37:03 crc kubenswrapper[4835]: I1003 18:37:03.636601 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" Oct 03 18:37:04 crc kubenswrapper[4835]: I1003 18:37:04.215254 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 03 18:37:04 crc kubenswrapper[4835]: I1003 18:37:04.215607 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48"] Oct 03 18:37:04 crc kubenswrapper[4835]: W1003 18:37:04.219912 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1571f66f_2633_4835_ba3f_db5f52eefb9b.slice/crio-4b38239e8cc226f9a16bf31520e50f1c85d8adb7b41798eef6045eb45477b3a0 WatchSource:0}: Error finding container 4b38239e8cc226f9a16bf31520e50f1c85d8adb7b41798eef6045eb45477b3a0: Status 404 returned error can't find the container with id 4b38239e8cc226f9a16bf31520e50f1c85d8adb7b41798eef6045eb45477b3a0 Oct 03 18:37:05 crc kubenswrapper[4835]: I1003 18:37:05.041811 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" event={"ID":"1571f66f-2633-4835-ba3f-db5f52eefb9b","Type":"ContainerStarted","Data":"4b38239e8cc226f9a16bf31520e50f1c85d8adb7b41798eef6045eb45477b3a0"} Oct 03 18:37:05 crc kubenswrapper[4835]: I1003 18:37:05.142226 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 03 18:37:05 crc kubenswrapper[4835]: I1003 18:37:05.358739 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:37:05 crc kubenswrapper[4835]: I1003 18:37:05.358786 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:37:05 crc kubenswrapper[4835]: I1003 18:37:05.358829 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:37:05 crc kubenswrapper[4835]: I1003 18:37:05.359528 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6256e41e920d222ccf54930d399b21cd032b6d7ace88624e5e3fa3510d642ea"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 18:37:05 crc kubenswrapper[4835]: I1003 18:37:05.359582 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://f6256e41e920d222ccf54930d399b21cd032b6d7ace88624e5e3fa3510d642ea" gracePeriod=600 Oct 03 18:37:06 crc kubenswrapper[4835]: I1003 18:37:06.064951 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="f6256e41e920d222ccf54930d399b21cd032b6d7ace88624e5e3fa3510d642ea" exitCode=0 Oct 03 
18:37:06 crc kubenswrapper[4835]: I1003 18:37:06.065006 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"f6256e41e920d222ccf54930d399b21cd032b6d7ace88624e5e3fa3510d642ea"} Oct 03 18:37:06 crc kubenswrapper[4835]: I1003 18:37:06.065296 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce"} Oct 03 18:37:06 crc kubenswrapper[4835]: I1003 18:37:06.065318 4835 scope.go:117] "RemoveContainer" containerID="5001daf2345cdee7613b40d98138459acb007dbdcb955c73ad790b203e897d4f" Oct 03 18:37:09 crc kubenswrapper[4835]: I1003 18:37:09.105832 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:37:09 crc kubenswrapper[4835]: I1003 18:37:09.106364 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:37:09 crc kubenswrapper[4835]: I1003 18:37:09.140339 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fmt76"] Oct 03 18:37:09 crc kubenswrapper[4835]: I1003 18:37:09.145178 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 18:37:09 crc kubenswrapper[4835]: I1003 18:37:09.156087 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fmt76"] Oct 03 18:37:09 crc kubenswrapper[4835]: I1003 18:37:09.174585 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:37:09 crc kubenswrapper[4835]: I1003 18:37:09.254578 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tdqf\" (UniqueName: \"kubernetes.io/projected/d8568964-6947-41d2-9e44-9ab1a39edc00-kube-api-access-7tdqf\") pod \"redhat-operators-fmt76\" (UID: \"d8568964-6947-41d2-9e44-9ab1a39edc00\") " pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 18:37:09 crc kubenswrapper[4835]: I1003 18:37:09.254684 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8568964-6947-41d2-9e44-9ab1a39edc00-utilities\") pod \"redhat-operators-fmt76\" (UID: \"d8568964-6947-41d2-9e44-9ab1a39edc00\") " pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 18:37:09 crc kubenswrapper[4835]: I1003 18:37:09.254739 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8568964-6947-41d2-9e44-9ab1a39edc00-catalog-content\") pod \"redhat-operators-fmt76\" (UID: \"d8568964-6947-41d2-9e44-9ab1a39edc00\") " pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 18:37:09 crc kubenswrapper[4835]: I1003 18:37:09.356893 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8568964-6947-41d2-9e44-9ab1a39edc00-utilities\") pod \"redhat-operators-fmt76\" (UID: \"d8568964-6947-41d2-9e44-9ab1a39edc00\") " pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 
18:37:09 crc kubenswrapper[4835]: I1003 18:37:09.356943 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8568964-6947-41d2-9e44-9ab1a39edc00-catalog-content\") pod \"redhat-operators-fmt76\" (UID: \"d8568964-6947-41d2-9e44-9ab1a39edc00\") " pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 18:37:09 crc kubenswrapper[4835]: I1003 18:37:09.357152 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tdqf\" (UniqueName: \"kubernetes.io/projected/d8568964-6947-41d2-9e44-9ab1a39edc00-kube-api-access-7tdqf\") pod \"redhat-operators-fmt76\" (UID: \"d8568964-6947-41d2-9e44-9ab1a39edc00\") " pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 18:37:09 crc kubenswrapper[4835]: I1003 18:37:09.357415 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8568964-6947-41d2-9e44-9ab1a39edc00-utilities\") pod \"redhat-operators-fmt76\" (UID: \"d8568964-6947-41d2-9e44-9ab1a39edc00\") " pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 18:37:09 crc kubenswrapper[4835]: I1003 18:37:09.357712 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8568964-6947-41d2-9e44-9ab1a39edc00-catalog-content\") pod \"redhat-operators-fmt76\" (UID: \"d8568964-6947-41d2-9e44-9ab1a39edc00\") " pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 18:37:09 crc kubenswrapper[4835]: I1003 18:37:09.387284 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tdqf\" (UniqueName: \"kubernetes.io/projected/d8568964-6947-41d2-9e44-9ab1a39edc00-kube-api-access-7tdqf\") pod \"redhat-operators-fmt76\" (UID: \"d8568964-6947-41d2-9e44-9ab1a39edc00\") " pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 18:37:09 crc kubenswrapper[4835]: I1003 18:37:09.488453 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 18:37:10 crc kubenswrapper[4835]: I1003 18:37:10.155013 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:37:11 crc kubenswrapper[4835]: I1003 18:37:11.515349 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mc5tw"] Oct 03 18:37:12 crc kubenswrapper[4835]: I1003 18:37:12.137612 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mc5tw" podUID="129015b0-9c98-40ff-ae05-049a58dacb3c" containerName="registry-server" containerID="cri-o://2a9274b10f664e7708c8e2aea97b6a1394bbad58fcdee64db3faf69618498d65" gracePeriod=2 Oct 03 18:37:13 crc kubenswrapper[4835]: I1003 18:37:13.152384 4835 generic.go:334] "Generic (PLEG): container finished" podID="129015b0-9c98-40ff-ae05-049a58dacb3c" containerID="2a9274b10f664e7708c8e2aea97b6a1394bbad58fcdee64db3faf69618498d65" exitCode=0 Oct 03 18:37:13 crc kubenswrapper[4835]: I1003 18:37:13.152454 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc5tw" event={"ID":"129015b0-9c98-40ff-ae05-049a58dacb3c","Type":"ContainerDied","Data":"2a9274b10f664e7708c8e2aea97b6a1394bbad58fcdee64db3faf69618498d65"} Oct 03 18:37:14 crc kubenswrapper[4835]: I1003 18:37:14.201953 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:37:14 crc kubenswrapper[4835]: I1003 18:37:14.275771 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/129015b0-9c98-40ff-ae05-049a58dacb3c-utilities\") pod \"129015b0-9c98-40ff-ae05-049a58dacb3c\" (UID: \"129015b0-9c98-40ff-ae05-049a58dacb3c\") " Oct 03 18:37:14 crc kubenswrapper[4835]: I1003 18:37:14.276010 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85fk9\" (UniqueName: \"kubernetes.io/projected/129015b0-9c98-40ff-ae05-049a58dacb3c-kube-api-access-85fk9\") pod \"129015b0-9c98-40ff-ae05-049a58dacb3c\" (UID: \"129015b0-9c98-40ff-ae05-049a58dacb3c\") " Oct 03 18:37:14 crc kubenswrapper[4835]: I1003 18:37:14.276129 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/129015b0-9c98-40ff-ae05-049a58dacb3c-catalog-content\") pod \"129015b0-9c98-40ff-ae05-049a58dacb3c\" (UID: \"129015b0-9c98-40ff-ae05-049a58dacb3c\") " Oct 03 18:37:14 crc kubenswrapper[4835]: I1003 18:37:14.276775 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/129015b0-9c98-40ff-ae05-049a58dacb3c-utilities" (OuterVolumeSpecName: "utilities") pod "129015b0-9c98-40ff-ae05-049a58dacb3c" (UID: "129015b0-9c98-40ff-ae05-049a58dacb3c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:37:14 crc kubenswrapper[4835]: I1003 18:37:14.280105 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/129015b0-9c98-40ff-ae05-049a58dacb3c-kube-api-access-85fk9" (OuterVolumeSpecName: "kube-api-access-85fk9") pod "129015b0-9c98-40ff-ae05-049a58dacb3c" (UID: "129015b0-9c98-40ff-ae05-049a58dacb3c"). InnerVolumeSpecName "kube-api-access-85fk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:37:14 crc kubenswrapper[4835]: I1003 18:37:14.286704 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/129015b0-9c98-40ff-ae05-049a58dacb3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "129015b0-9c98-40ff-ae05-049a58dacb3c" (UID: "129015b0-9c98-40ff-ae05-049a58dacb3c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:37:14 crc kubenswrapper[4835]: I1003 18:37:14.371481 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fmt76"] Oct 03 18:37:14 crc kubenswrapper[4835]: W1003 18:37:14.372998 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8568964_6947_41d2_9e44_9ab1a39edc00.slice/crio-6eae1f366fcac7a6c1bb0efcec08a180cb04f1b0e58f6f3358bfb52ff8be4a0f WatchSource:0}: Error finding container 6eae1f366fcac7a6c1bb0efcec08a180cb04f1b0e58f6f3358bfb52ff8be4a0f: Status 404 returned error can't find the container with id 6eae1f366fcac7a6c1bb0efcec08a180cb04f1b0e58f6f3358bfb52ff8be4a0f Oct 03 18:37:14 crc kubenswrapper[4835]: I1003 18:37:14.378016 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/129015b0-9c98-40ff-ae05-049a58dacb3c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:37:14 crc kubenswrapper[4835]: I1003 18:37:14.378048 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85fk9\" (UniqueName: \"kubernetes.io/projected/129015b0-9c98-40ff-ae05-049a58dacb3c-kube-api-access-85fk9\") on node \"crc\" DevicePath \"\"" Oct 03 18:37:14 crc kubenswrapper[4835]: I1003 18:37:14.378059 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/129015b0-9c98-40ff-ae05-049a58dacb3c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:37:15 crc kubenswrapper[4835]: I1003 18:37:15.172868 4835 generic.go:334] "Generic (PLEG): container finished" podID="d8568964-6947-41d2-9e44-9ab1a39edc00" containerID="68d3e272c2b523c126f92ae58c80df6b10d16bec35433b7844c87963d64a8281" exitCode=0 Oct 03 18:37:15 crc kubenswrapper[4835]: I1003 18:37:15.173427 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmt76" event={"ID":"d8568964-6947-41d2-9e44-9ab1a39edc00","Type":"ContainerDied","Data":"68d3e272c2b523c126f92ae58c80df6b10d16bec35433b7844c87963d64a8281"} Oct 03 18:37:15 crc kubenswrapper[4835]: I1003 18:37:15.173494 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmt76" event={"ID":"d8568964-6947-41d2-9e44-9ab1a39edc00","Type":"ContainerStarted","Data":"6eae1f366fcac7a6c1bb0efcec08a180cb04f1b0e58f6f3358bfb52ff8be4a0f"} Oct 03 18:37:15 crc kubenswrapper[4835]: I1003 18:37:15.174857 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" event={"ID":"1571f66f-2633-4835-ba3f-db5f52eefb9b","Type":"ContainerStarted","Data":"b37ed088ee6897e7769d5d7f1f9ba094f8ff1875a2279172cbc6ee53bdaa2ebe"} Oct 03 18:37:15 crc kubenswrapper[4835]: I1003 18:37:15.190528 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mc5tw" event={"ID":"129015b0-9c98-40ff-ae05-049a58dacb3c","Type":"ContainerDied","Data":"853f034785d4cf625647f7ad753fa2a3ccfc69890c836c2a1ea9a9555ccda8b2"} Oct 03 18:37:15 crc kubenswrapper[4835]: I1003 18:37:15.190736 4835 scope.go:117] "RemoveContainer" containerID="2a9274b10f664e7708c8e2aea97b6a1394bbad58fcdee64db3faf69618498d65" Oct 03 18:37:15 crc kubenswrapper[4835]: I1003 18:37:15.190973 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mc5tw" Oct 03 18:37:15 crc kubenswrapper[4835]: I1003 18:37:15.225954 4835 scope.go:117] "RemoveContainer" containerID="099a3c483af5d50cfc1246b4b24454e52cfe98fb127107e2053d883269b8d135" Oct 03 18:37:15 crc kubenswrapper[4835]: I1003 18:37:15.229887 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" podStartSLOduration=2.525183775 podStartE2EDuration="12.229860981s" podCreationTimestamp="2025-10-03 18:37:03 +0000 UTC" firstStartedPulling="2025-10-03 18:37:04.22320901 +0000 UTC m=+1365.939149882" lastFinishedPulling="2025-10-03 18:37:13.927886226 +0000 UTC m=+1375.643827088" observedRunningTime="2025-10-03 18:37:15.224035126 +0000 UTC m=+1376.939976008" watchObservedRunningTime="2025-10-03 18:37:15.229860981 +0000 UTC m=+1376.945801863" Oct 03 18:37:15 crc kubenswrapper[4835]: I1003 18:37:15.263359 4835 scope.go:117] "RemoveContainer" containerID="8a5a363eb9eb5710e77c44988c3f8a60479345c5dd2c5629ff4bd750cf79a213" Oct 03 18:37:15 crc kubenswrapper[4835]: I1003 18:37:15.266170 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mc5tw"] Oct 03 18:37:15 crc kubenswrapper[4835]: I1003 18:37:15.275750 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mc5tw"] Oct 03 18:37:16 crc kubenswrapper[4835]: I1003 18:37:16.893326 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="129015b0-9c98-40ff-ae05-049a58dacb3c" path="/var/lib/kubelet/pods/129015b0-9c98-40ff-ae05-049a58dacb3c/volumes" Oct 03 18:37:17 crc kubenswrapper[4835]: I1003 18:37:17.210704 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmt76" event={"ID":"d8568964-6947-41d2-9e44-9ab1a39edc00","Type":"ContainerStarted","Data":"ca4829f04c6528cb6bb20644bd5efd71e1fc306e89e20b274f1321de37dee52b"} Oct 03 18:37:19 crc kubenswrapper[4835]: I1003 18:37:19.275112 4835 generic.go:334] "Generic (PLEG): container finished" podID="d8568964-6947-41d2-9e44-9ab1a39edc00" containerID="ca4829f04c6528cb6bb20644bd5efd71e1fc306e89e20b274f1321de37dee52b" exitCode=0 Oct 03 18:37:19 crc kubenswrapper[4835]: I1003 18:37:19.275224 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmt76" event={"ID":"d8568964-6947-41d2-9e44-9ab1a39edc00","Type":"ContainerDied","Data":"ca4829f04c6528cb6bb20644bd5efd71e1fc306e89e20b274f1321de37dee52b"} Oct 03 18:37:20 crc kubenswrapper[4835]: I1003 18:37:20.286837 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmt76" event={"ID":"d8568964-6947-41d2-9e44-9ab1a39edc00","Type":"ContainerStarted","Data":"34b5f93316af18a1da2e5ad8617f0430270e637310c306b375d2ec8d5e7caf29"} Oct 03 18:37:20 crc kubenswrapper[4835]: I1003 18:37:20.319192 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fmt76" podStartSLOduration=6.699229935 podStartE2EDuration="11.319164326s" podCreationTimestamp="2025-10-03 18:37:09 +0000 UTC" firstStartedPulling="2025-10-03 18:37:15.176581948 +0000 UTC m=+1376.892522820" lastFinishedPulling="2025-10-03 18:37:19.796516339 +0000 UTC m=+1381.512457211" observedRunningTime="2025-10-03 18:37:20.311134606 +0000 UTC m=+1382.027075488" watchObservedRunningTime="2025-10-03 18:37:20.319164326 +0000 UTC m=+1382.035105218" Oct 03 18:37:25 crc 
kubenswrapper[4835]: I1003 18:37:25.335892 4835 generic.go:334] "Generic (PLEG): container finished" podID="1571f66f-2633-4835-ba3f-db5f52eefb9b" containerID="b37ed088ee6897e7769d5d7f1f9ba094f8ff1875a2279172cbc6ee53bdaa2ebe" exitCode=0 Oct 03 18:37:25 crc kubenswrapper[4835]: I1003 18:37:25.335965 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" event={"ID":"1571f66f-2633-4835-ba3f-db5f52eefb9b","Type":"ContainerDied","Data":"b37ed088ee6897e7769d5d7f1f9ba094f8ff1875a2279172cbc6ee53bdaa2ebe"} Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:26.999858 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.027930 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-repo-setup-combined-ca-bundle\") pod \"1571f66f-2633-4835-ba3f-db5f52eefb9b\" (UID: \"1571f66f-2633-4835-ba3f-db5f52eefb9b\") " Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.028167 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f2b6\" (UniqueName: \"kubernetes.io/projected/1571f66f-2633-4835-ba3f-db5f52eefb9b-kube-api-access-6f2b6\") pod \"1571f66f-2633-4835-ba3f-db5f52eefb9b\" (UID: \"1571f66f-2633-4835-ba3f-db5f52eefb9b\") " Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.028260 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-ssh-key\") pod \"1571f66f-2633-4835-ba3f-db5f52eefb9b\" (UID: \"1571f66f-2633-4835-ba3f-db5f52eefb9b\") " Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.028300 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-inventory\") pod \"1571f66f-2633-4835-ba3f-db5f52eefb9b\" (UID: \"1571f66f-2633-4835-ba3f-db5f52eefb9b\") " Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.040946 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1571f66f-2633-4835-ba3f-db5f52eefb9b-kube-api-access-6f2b6" (OuterVolumeSpecName: "kube-api-access-6f2b6") pod "1571f66f-2633-4835-ba3f-db5f52eefb9b" (UID: "1571f66f-2633-4835-ba3f-db5f52eefb9b"). InnerVolumeSpecName "kube-api-access-6f2b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.043586 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "1571f66f-2633-4835-ba3f-db5f52eefb9b" (UID: "1571f66f-2633-4835-ba3f-db5f52eefb9b"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.078019 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-inventory" (OuterVolumeSpecName: "inventory") pod "1571f66f-2633-4835-ba3f-db5f52eefb9b" (UID: "1571f66f-2633-4835-ba3f-db5f52eefb9b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.085610 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1571f66f-2633-4835-ba3f-db5f52eefb9b" (UID: "1571f66f-2633-4835-ba3f-db5f52eefb9b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.131090 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f2b6\" (UniqueName: \"kubernetes.io/projected/1571f66f-2633-4835-ba3f-db5f52eefb9b-kube-api-access-6f2b6\") on node \"crc\" DevicePath \"\"" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.131134 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.131148 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.131161 4835 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1571f66f-2633-4835-ba3f-db5f52eefb9b-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.364927 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" event={"ID":"1571f66f-2633-4835-ba3f-db5f52eefb9b","Type":"ContainerDied","Data":"4b38239e8cc226f9a16bf31520e50f1c85d8adb7b41798eef6045eb45477b3a0"} Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.364969 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b38239e8cc226f9a16bf31520e50f1c85d8adb7b41798eef6045eb45477b3a0" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.365013 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.444634 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc"] Oct 03 18:37:27 crc kubenswrapper[4835]: E1003 18:37:27.445086 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129015b0-9c98-40ff-ae05-049a58dacb3c" containerName="extract-utilities" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.445103 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="129015b0-9c98-40ff-ae05-049a58dacb3c" containerName="extract-utilities" Oct 03 18:37:27 crc kubenswrapper[4835]: E1003 18:37:27.445116 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1571f66f-2633-4835-ba3f-db5f52eefb9b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.445125 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1571f66f-2633-4835-ba3f-db5f52eefb9b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 03 18:37:27 crc kubenswrapper[4835]: E1003 18:37:27.445155 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129015b0-9c98-40ff-ae05-049a58dacb3c" containerName="registry-server" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.445162 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="129015b0-9c98-40ff-ae05-049a58dacb3c" containerName="registry-server" Oct 03 18:37:27 crc kubenswrapper[4835]: E1003 18:37:27.445186 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="129015b0-9c98-40ff-ae05-049a58dacb3c" containerName="extract-content" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.445193 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="129015b0-9c98-40ff-ae05-049a58dacb3c" containerName="extract-content" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.445373 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="129015b0-9c98-40ff-ae05-049a58dacb3c" containerName="registry-server" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.445401 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1571f66f-2633-4835-ba3f-db5f52eefb9b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.446120 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.449184 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.449492 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.449629 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.454989 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc"] Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.455472 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.538359 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87be00bb-8652-4279-a481-d69d219cd882-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4mcgc\" (UID: \"87be00bb-8652-4279-a481-d69d219cd882\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.538735 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87be00bb-8652-4279-a481-d69d219cd882-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4mcgc\" (UID: \"87be00bb-8652-4279-a481-d69d219cd882\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.538780 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxn6k\" (UniqueName: \"kubernetes.io/projected/87be00bb-8652-4279-a481-d69d219cd882-kube-api-access-xxn6k\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4mcgc\" (UID: \"87be00bb-8652-4279-a481-d69d219cd882\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.640228 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87be00bb-8652-4279-a481-d69d219cd882-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4mcgc\" (UID: \"87be00bb-8652-4279-a481-d69d219cd882\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.640515 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxn6k\" (UniqueName: \"kubernetes.io/projected/87be00bb-8652-4279-a481-d69d219cd882-kube-api-access-xxn6k\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4mcgc\" (UID: \"87be00bb-8652-4279-a481-d69d219cd882\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.640698 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87be00bb-8652-4279-a481-d69d219cd882-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4mcgc\" (UID: \"87be00bb-8652-4279-a481-d69d219cd882\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.645409 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87be00bb-8652-4279-a481-d69d219cd882-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4mcgc\" (UID: \"87be00bb-8652-4279-a481-d69d219cd882\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.645690 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87be00bb-8652-4279-a481-d69d219cd882-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4mcgc\" (UID: \"87be00bb-8652-4279-a481-d69d219cd882\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.655853 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxn6k\" (UniqueName: \"kubernetes.io/projected/87be00bb-8652-4279-a481-d69d219cd882-kube-api-access-xxn6k\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-4mcgc\" (UID: \"87be00bb-8652-4279-a481-d69d219cd882\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" Oct 03 18:37:27 crc kubenswrapper[4835]: I1003 18:37:27.767999 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" Oct 03 18:37:28 crc kubenswrapper[4835]: I1003 18:37:28.278470 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc"] Oct 03 18:37:28 crc kubenswrapper[4835]: I1003 18:37:28.383681 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" event={"ID":"87be00bb-8652-4279-a481-d69d219cd882","Type":"ContainerStarted","Data":"836b2af68f813ee226fbd4871c57f0e1b36ad8013e3da404c6efd12ab07072b5"} Oct 03 18:37:29 crc kubenswrapper[4835]: I1003 18:37:29.396662 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" event={"ID":"87be00bb-8652-4279-a481-d69d219cd882","Type":"ContainerStarted","Data":"99852c2c736da6d65eff02c26e1acab0d4c58d51ab0bedd5e1f35c06edf0aa14"} Oct 03 18:37:29 crc kubenswrapper[4835]: I1003 18:37:29.488581 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 18:37:29 crc kubenswrapper[4835]: I1003 18:37:29.488634 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 18:37:29 crc kubenswrapper[4835]: I1003 18:37:29.545969 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 18:37:29 crc kubenswrapper[4835]: I1003 18:37:29.569622 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" podStartSLOduration=2.103365728 podStartE2EDuration="2.569602883s" podCreationTimestamp="2025-10-03 18:37:27 +0000 UTC" firstStartedPulling="2025-10-03 18:37:28.276689093 +0000 UTC m=+1389.992629965" lastFinishedPulling="2025-10-03 18:37:28.742926248 +0000 UTC m=+1390.458867120" observedRunningTime="2025-10-03 18:37:29.414383909 +0000 UTC m=+1391.130324781" watchObservedRunningTime="2025-10-03 18:37:29.569602883 +0000 
UTC m=+1391.285543755" Oct 03 18:37:30 crc kubenswrapper[4835]: I1003 18:37:30.476979 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 18:37:30 crc kubenswrapper[4835]: I1003 18:37:30.541335 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fmt76"] Oct 03 18:37:31 crc kubenswrapper[4835]: I1003 18:37:31.430165 4835 generic.go:334] "Generic (PLEG): container finished" podID="87be00bb-8652-4279-a481-d69d219cd882" containerID="99852c2c736da6d65eff02c26e1acab0d4c58d51ab0bedd5e1f35c06edf0aa14" exitCode=0 Oct 03 18:37:31 crc kubenswrapper[4835]: I1003 18:37:31.430282 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" event={"ID":"87be00bb-8652-4279-a481-d69d219cd882","Type":"ContainerDied","Data":"99852c2c736da6d65eff02c26e1acab0d4c58d51ab0bedd5e1f35c06edf0aa14"} Oct 03 18:37:32 crc kubenswrapper[4835]: I1003 18:37:32.438843 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fmt76" podUID="d8568964-6947-41d2-9e44-9ab1a39edc00" containerName="registry-server" containerID="cri-o://34b5f93316af18a1da2e5ad8617f0430270e637310c306b375d2ec8d5e7caf29" gracePeriod=2 Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.047943 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.056121 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.161309 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87be00bb-8652-4279-a481-d69d219cd882-inventory\") pod \"87be00bb-8652-4279-a481-d69d219cd882\" (UID: \"87be00bb-8652-4279-a481-d69d219cd882\") " Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.161393 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87be00bb-8652-4279-a481-d69d219cd882-ssh-key\") pod \"87be00bb-8652-4279-a481-d69d219cd882\" (UID: \"87be00bb-8652-4279-a481-d69d219cd882\") " Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.161498 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxn6k\" (UniqueName: \"kubernetes.io/projected/87be00bb-8652-4279-a481-d69d219cd882-kube-api-access-xxn6k\") pod \"87be00bb-8652-4279-a481-d69d219cd882\" (UID: \"87be00bb-8652-4279-a481-d69d219cd882\") " Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.161537 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8568964-6947-41d2-9e44-9ab1a39edc00-catalog-content\") pod \"d8568964-6947-41d2-9e44-9ab1a39edc00\" (UID: \"d8568964-6947-41d2-9e44-9ab1a39edc00\") " Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.162273 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8568964-6947-41d2-9e44-9ab1a39edc00-utilities\") pod \"d8568964-6947-41d2-9e44-9ab1a39edc00\" (UID: \"d8568964-6947-41d2-9e44-9ab1a39edc00\") " Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.162329 
4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tdqf\" (UniqueName: \"kubernetes.io/projected/d8568964-6947-41d2-9e44-9ab1a39edc00-kube-api-access-7tdqf\") pod \"d8568964-6947-41d2-9e44-9ab1a39edc00\" (UID: \"d8568964-6947-41d2-9e44-9ab1a39edc00\") " Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.163786 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8568964-6947-41d2-9e44-9ab1a39edc00-utilities" (OuterVolumeSpecName: "utilities") pod "d8568964-6947-41d2-9e44-9ab1a39edc00" (UID: "d8568964-6947-41d2-9e44-9ab1a39edc00"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.167270 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87be00bb-8652-4279-a481-d69d219cd882-kube-api-access-xxn6k" (OuterVolumeSpecName: "kube-api-access-xxn6k") pod "87be00bb-8652-4279-a481-d69d219cd882" (UID: "87be00bb-8652-4279-a481-d69d219cd882"). InnerVolumeSpecName "kube-api-access-xxn6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.179237 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8568964-6947-41d2-9e44-9ab1a39edc00-kube-api-access-7tdqf" (OuterVolumeSpecName: "kube-api-access-7tdqf") pod "d8568964-6947-41d2-9e44-9ab1a39edc00" (UID: "d8568964-6947-41d2-9e44-9ab1a39edc00"). InnerVolumeSpecName "kube-api-access-7tdqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.192479 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87be00bb-8652-4279-a481-d69d219cd882-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "87be00bb-8652-4279-a481-d69d219cd882" (UID: "87be00bb-8652-4279-a481-d69d219cd882"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.193569 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87be00bb-8652-4279-a481-d69d219cd882-inventory" (OuterVolumeSpecName: "inventory") pod "87be00bb-8652-4279-a481-d69d219cd882" (UID: "87be00bb-8652-4279-a481-d69d219cd882"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.244738 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8568964-6947-41d2-9e44-9ab1a39edc00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8568964-6947-41d2-9e44-9ab1a39edc00" (UID: "d8568964-6947-41d2-9e44-9ab1a39edc00"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.264926 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxn6k\" (UniqueName: \"kubernetes.io/projected/87be00bb-8652-4279-a481-d69d219cd882-kube-api-access-xxn6k\") on node \"crc\" DevicePath \"\"" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.264964 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8568964-6947-41d2-9e44-9ab1a39edc00-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.264974 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8568964-6947-41d2-9e44-9ab1a39edc00-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.264985 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tdqf\" (UniqueName: \"kubernetes.io/projected/d8568964-6947-41d2-9e44-9ab1a39edc00-kube-api-access-7tdqf\") on node \"crc\" DevicePath \"\"" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.264995 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87be00bb-8652-4279-a481-d69d219cd882-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.265002 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87be00bb-8652-4279-a481-d69d219cd882-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.450926 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" event={"ID":"87be00bb-8652-4279-a481-d69d219cd882","Type":"ContainerDied","Data":"836b2af68f813ee226fbd4871c57f0e1b36ad8013e3da404c6efd12ab07072b5"} Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.450949 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-4mcgc" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.450973 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="836b2af68f813ee226fbd4871c57f0e1b36ad8013e3da404c6efd12ab07072b5" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.453159 4835 generic.go:334] "Generic (PLEG): container finished" podID="d8568964-6947-41d2-9e44-9ab1a39edc00" containerID="34b5f93316af18a1da2e5ad8617f0430270e637310c306b375d2ec8d5e7caf29" exitCode=0 Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.453197 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmt76" event={"ID":"d8568964-6947-41d2-9e44-9ab1a39edc00","Type":"ContainerDied","Data":"34b5f93316af18a1da2e5ad8617f0430270e637310c306b375d2ec8d5e7caf29"} Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.453223 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmt76" event={"ID":"d8568964-6947-41d2-9e44-9ab1a39edc00","Type":"ContainerDied","Data":"6eae1f366fcac7a6c1bb0efcec08a180cb04f1b0e58f6f3358bfb52ff8be4a0f"} Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.453239 4835 scope.go:117] "RemoveContainer" containerID="34b5f93316af18a1da2e5ad8617f0430270e637310c306b375d2ec8d5e7caf29" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.453368 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fmt76" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.503755 4835 scope.go:117] "RemoveContainer" containerID="ca4829f04c6528cb6bb20644bd5efd71e1fc306e89e20b274f1321de37dee52b" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.542357 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fmt76"] Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.543903 4835 scope.go:117] "RemoveContainer" containerID="68d3e272c2b523c126f92ae58c80df6b10d16bec35433b7844c87963d64a8281" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.599647 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fmt76"] Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.612586 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns"] Oct 03 18:37:33 crc kubenswrapper[4835]: E1003 18:37:33.613035 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8568964-6947-41d2-9e44-9ab1a39edc00" containerName="extract-content" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.613052 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8568964-6947-41d2-9e44-9ab1a39edc00" containerName="extract-content" Oct 03 18:37:33 crc kubenswrapper[4835]: E1003 18:37:33.613081 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8568964-6947-41d2-9e44-9ab1a39edc00" containerName="registry-server" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.613087 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8568964-6947-41d2-9e44-9ab1a39edc00" containerName="registry-server" Oct 03 18:37:33 crc kubenswrapper[4835]: E1003 18:37:33.613102 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8568964-6947-41d2-9e44-9ab1a39edc00" containerName="extract-utilities" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.613109 4835 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="d8568964-6947-41d2-9e44-9ab1a39edc00" containerName="extract-utilities" Oct 03 18:37:33 crc kubenswrapper[4835]: E1003 18:37:33.613126 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87be00bb-8652-4279-a481-d69d219cd882" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.613132 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="87be00bb-8652-4279-a481-d69d219cd882" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.613308 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8568964-6947-41d2-9e44-9ab1a39edc00" containerName="registry-server" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.613334 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="87be00bb-8652-4279-a481-d69d219cd882" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.614032 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.617770 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.618147 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.618978 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.619348 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.631488 4835 scope.go:117] "RemoveContainer" containerID="34b5f93316af18a1da2e5ad8617f0430270e637310c306b375d2ec8d5e7caf29" Oct 03 18:37:33 crc kubenswrapper[4835]: E1003 18:37:33.632055 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34b5f93316af18a1da2e5ad8617f0430270e637310c306b375d2ec8d5e7caf29\": container with ID starting with 34b5f93316af18a1da2e5ad8617f0430270e637310c306b375d2ec8d5e7caf29 not found: ID does not exist" containerID="34b5f93316af18a1da2e5ad8617f0430270e637310c306b375d2ec8d5e7caf29" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.632114 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b5f93316af18a1da2e5ad8617f0430270e637310c306b375d2ec8d5e7caf29"} err="failed to get container status \"34b5f93316af18a1da2e5ad8617f0430270e637310c306b375d2ec8d5e7caf29\": rpc error: code = NotFound desc = could not find container \"34b5f93316af18a1da2e5ad8617f0430270e637310c306b375d2ec8d5e7caf29\": container with ID starting with 34b5f93316af18a1da2e5ad8617f0430270e637310c306b375d2ec8d5e7caf29 not found: ID does not exist" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.632142 4835 scope.go:117] "RemoveContainer" containerID="ca4829f04c6528cb6bb20644bd5efd71e1fc306e89e20b274f1321de37dee52b" Oct 03 18:37:33 crc kubenswrapper[4835]: E1003 18:37:33.632540 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ca4829f04c6528cb6bb20644bd5efd71e1fc306e89e20b274f1321de37dee52b\": container with ID starting with ca4829f04c6528cb6bb20644bd5efd71e1fc306e89e20b274f1321de37dee52b not found: ID does not exist" containerID="ca4829f04c6528cb6bb20644bd5efd71e1fc306e89e20b274f1321de37dee52b" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.632578 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca4829f04c6528cb6bb20644bd5efd71e1fc306e89e20b274f1321de37dee52b"} err="failed to get container status \"ca4829f04c6528cb6bb20644bd5efd71e1fc306e89e20b274f1321de37dee52b\": rpc error: code = NotFound desc = could not find container \"ca4829f04c6528cb6bb20644bd5efd71e1fc306e89e20b274f1321de37dee52b\": container with ID starting with ca4829f04c6528cb6bb20644bd5efd71e1fc306e89e20b274f1321de37dee52b not found: ID does not exist" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.632605 4835 scope.go:117] "RemoveContainer" containerID="68d3e272c2b523c126f92ae58c80df6b10d16bec35433b7844c87963d64a8281" Oct 03 18:37:33 crc kubenswrapper[4835]: E1003 18:37:33.632864 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68d3e272c2b523c126f92ae58c80df6b10d16bec35433b7844c87963d64a8281\": container with ID starting with 68d3e272c2b523c126f92ae58c80df6b10d16bec35433b7844c87963d64a8281 not found: ID does not exist" containerID="68d3e272c2b523c126f92ae58c80df6b10d16bec35433b7844c87963d64a8281" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.632879 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68d3e272c2b523c126f92ae58c80df6b10d16bec35433b7844c87963d64a8281"} err="failed to get container status \"68d3e272c2b523c126f92ae58c80df6b10d16bec35433b7844c87963d64a8281\": rpc error: code = NotFound desc = could not find container \"68d3e272c2b523c126f92ae58c80df6b10d16bec35433b7844c87963d64a8281\": container with ID starting with 68d3e272c2b523c126f92ae58c80df6b10d16bec35433b7844c87963d64a8281 not found: ID does not exist" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.643463 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns"] Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.673735 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns\" (UID: \"99ce37b7-29b9-44ed-a066-bc503ae35b61\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.673976 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzrml\" (UniqueName: \"kubernetes.io/projected/99ce37b7-29b9-44ed-a066-bc503ae35b61-kube-api-access-pzrml\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns\" (UID: \"99ce37b7-29b9-44ed-a066-bc503ae35b61\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.674109 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns\" (UID: 
\"99ce37b7-29b9-44ed-a066-bc503ae35b61\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.674573 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns\" (UID: \"99ce37b7-29b9-44ed-a066-bc503ae35b61\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.776219 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns\" (UID: \"99ce37b7-29b9-44ed-a066-bc503ae35b61\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.776318 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns\" (UID: \"99ce37b7-29b9-44ed-a066-bc503ae35b61\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.776360 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzrml\" (UniqueName: \"kubernetes.io/projected/99ce37b7-29b9-44ed-a066-bc503ae35b61-kube-api-access-pzrml\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns\" (UID: \"99ce37b7-29b9-44ed-a066-bc503ae35b61\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.776392 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns\" (UID: \"99ce37b7-29b9-44ed-a066-bc503ae35b61\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.780143 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns\" (UID: \"99ce37b7-29b9-44ed-a066-bc503ae35b61\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.780155 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns\" (UID: \"99ce37b7-29b9-44ed-a066-bc503ae35b61\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.783780 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns\" (UID: \"99ce37b7-29b9-44ed-a066-bc503ae35b61\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.792192 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzrml\" (UniqueName: \"kubernetes.io/projected/99ce37b7-29b9-44ed-a066-bc503ae35b61-kube-api-access-pzrml\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns\" (UID: \"99ce37b7-29b9-44ed-a066-bc503ae35b61\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" Oct 03 18:37:33 crc kubenswrapper[4835]: I1003 18:37:33.964289 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" Oct 03 18:37:34 crc kubenswrapper[4835]: I1003 18:37:34.507514 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns"] Oct 03 18:37:34 crc kubenswrapper[4835]: I1003 18:37:34.893669 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8568964-6947-41d2-9e44-9ab1a39edc00" path="/var/lib/kubelet/pods/d8568964-6947-41d2-9e44-9ab1a39edc00/volumes" Oct 03 18:37:35 crc kubenswrapper[4835]: I1003 18:37:35.474235 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" event={"ID":"99ce37b7-29b9-44ed-a066-bc503ae35b61","Type":"ContainerStarted","Data":"978f1be6ffa7a43cd1652de497582733a750323e90ccefd50633bf74c32d7471"} Oct 03 18:37:35 crc kubenswrapper[4835]: I1003 18:37:35.474556 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" event={"ID":"99ce37b7-29b9-44ed-a066-bc503ae35b61","Type":"ContainerStarted","Data":"e8f0077a8d82c6b45b764688b5f246207206f14a1f200c42599b5b1b7ca585d5"} Oct 03 18:37:35 crc kubenswrapper[4835]: I1003 18:37:35.497228 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" podStartSLOduration=1.966626428 podStartE2EDuration="2.497211791s" podCreationTimestamp="2025-10-03 18:37:33 +0000 UTC" firstStartedPulling="2025-10-03 18:37:34.51560018 +0000 UTC m=+1396.231541052" lastFinishedPulling="2025-10-03 18:37:35.046185543 +0000 UTC m=+1396.762126415" observedRunningTime="2025-10-03 18:37:35.496411032 +0000 UTC m=+1397.212351904" watchObservedRunningTime="2025-10-03 18:37:35.497211791 +0000 UTC m=+1397.213152663" Oct 03 18:37:40 crc kubenswrapper[4835]: I1003 18:37:40.658855 4835 scope.go:117] "RemoveContainer" containerID="e419d1b2ee6873736a130d7289602eb9547f62118a25f036775816834ffc3d68" Oct 03 18:37:40 crc kubenswrapper[4835]: I1003 18:37:40.686947 4835 scope.go:117] "RemoveContainer" containerID="dab9ce77bce4472a506459f98e1685e2f0ed108cb68a34365d95023f2d404c24" Oct 03 18:38:15 crc kubenswrapper[4835]: I1003 18:38:15.794733 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8fmm8"] Oct 03 18:38:15 crc kubenswrapper[4835]: I1003 18:38:15.797514 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:15 crc kubenswrapper[4835]: I1003 18:38:15.805862 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8fmm8"] Oct 03 18:38:15 crc kubenswrapper[4835]: I1003 18:38:15.900915 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkzwz\" (UniqueName: \"kubernetes.io/projected/67e0358c-f426-4b0c-8758-f8ef527f36d0-kube-api-access-tkzwz\") pod \"community-operators-8fmm8\" (UID: \"67e0358c-f426-4b0c-8758-f8ef527f36d0\") " pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:15 crc kubenswrapper[4835]: I1003 18:38:15.901159 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67e0358c-f426-4b0c-8758-f8ef527f36d0-utilities\") pod \"community-operators-8fmm8\" (UID: \"67e0358c-f426-4b0c-8758-f8ef527f36d0\") " pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:15 crc kubenswrapper[4835]: I1003 18:38:15.901250 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67e0358c-f426-4b0c-8758-f8ef527f36d0-catalog-content\") pod \"community-operators-8fmm8\" (UID: \"67e0358c-f426-4b0c-8758-f8ef527f36d0\") " pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:16 crc kubenswrapper[4835]: I1003 18:38:16.003009 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkzwz\" (UniqueName: \"kubernetes.io/projected/67e0358c-f426-4b0c-8758-f8ef527f36d0-kube-api-access-tkzwz\") pod \"community-operators-8fmm8\" (UID: \"67e0358c-f426-4b0c-8758-f8ef527f36d0\") " pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:16 crc kubenswrapper[4835]: I1003 18:38:16.003303 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67e0358c-f426-4b0c-8758-f8ef527f36d0-utilities\") pod \"community-operators-8fmm8\" (UID: \"67e0358c-f426-4b0c-8758-f8ef527f36d0\") " pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:16 crc kubenswrapper[4835]: I1003 18:38:16.003363 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67e0358c-f426-4b0c-8758-f8ef527f36d0-catalog-content\") pod \"community-operators-8fmm8\" (UID: \"67e0358c-f426-4b0c-8758-f8ef527f36d0\") " pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:16 crc kubenswrapper[4835]: I1003 18:38:16.003851 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67e0358c-f426-4b0c-8758-f8ef527f36d0-catalog-content\") pod \"community-operators-8fmm8\" (UID: \"67e0358c-f426-4b0c-8758-f8ef527f36d0\") " pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:16 crc kubenswrapper[4835]: I1003 18:38:16.004564 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67e0358c-f426-4b0c-8758-f8ef527f36d0-utilities\") pod \"community-operators-8fmm8\" (UID: \"67e0358c-f426-4b0c-8758-f8ef527f36d0\") " pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:16 crc kubenswrapper[4835]: I1003 18:38:16.028476 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tkzwz\" (UniqueName: \"kubernetes.io/projected/67e0358c-f426-4b0c-8758-f8ef527f36d0-kube-api-access-tkzwz\") pod \"community-operators-8fmm8\" (UID: \"67e0358c-f426-4b0c-8758-f8ef527f36d0\") " pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:16 crc kubenswrapper[4835]: I1003 18:38:16.131406 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:16 crc kubenswrapper[4835]: I1003 18:38:16.612816 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8fmm8"] Oct 03 18:38:16 crc kubenswrapper[4835]: I1003 18:38:16.869630 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fmm8" event={"ID":"67e0358c-f426-4b0c-8758-f8ef527f36d0","Type":"ContainerStarted","Data":"b06e751dd2fc4f58d293683e95f8b242158b261914ff68d31a85e8e43c84b0b4"} Oct 03 18:38:16 crc kubenswrapper[4835]: I1003 18:38:16.869937 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fmm8" event={"ID":"67e0358c-f426-4b0c-8758-f8ef527f36d0","Type":"ContainerStarted","Data":"f4c9f308a66c3f245c04501f3eb2e6a35b35a31b21ae03c795b655e4ac728039"} Oct 03 18:38:17 crc kubenswrapper[4835]: I1003 18:38:17.881763 4835 generic.go:334] "Generic (PLEG): container finished" podID="67e0358c-f426-4b0c-8758-f8ef527f36d0" containerID="b06e751dd2fc4f58d293683e95f8b242158b261914ff68d31a85e8e43c84b0b4" exitCode=0 Oct 03 18:38:17 crc kubenswrapper[4835]: I1003 18:38:17.881895 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fmm8" event={"ID":"67e0358c-f426-4b0c-8758-f8ef527f36d0","Type":"ContainerDied","Data":"b06e751dd2fc4f58d293683e95f8b242158b261914ff68d31a85e8e43c84b0b4"} Oct 03 18:38:19 crc kubenswrapper[4835]: I1003 18:38:19.924120 4835 generic.go:334] "Generic (PLEG): container finished" podID="67e0358c-f426-4b0c-8758-f8ef527f36d0" containerID="828cab164332b03da3e7211ddb8e5fd1cd3dadc1f211b38b96fb6ecb2e6fb449" exitCode=0 Oct 03 18:38:19 crc kubenswrapper[4835]: I1003 18:38:19.924191 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fmm8" event={"ID":"67e0358c-f426-4b0c-8758-f8ef527f36d0","Type":"ContainerDied","Data":"828cab164332b03da3e7211ddb8e5fd1cd3dadc1f211b38b96fb6ecb2e6fb449"} Oct 03 18:38:20 crc kubenswrapper[4835]: I1003 18:38:20.939613 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fmm8" event={"ID":"67e0358c-f426-4b0c-8758-f8ef527f36d0","Type":"ContainerStarted","Data":"bfc47caa7a875465dbc137c7c9b71b94f854e6d8e4a04ed21561b67575fb16a5"} Oct 03 18:38:20 crc kubenswrapper[4835]: I1003 18:38:20.968539 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8fmm8" podStartSLOduration=3.561660144 podStartE2EDuration="5.96852196s" podCreationTimestamp="2025-10-03 18:38:15 +0000 UTC" firstStartedPulling="2025-10-03 18:38:17.884510253 +0000 UTC m=+1439.600451125" lastFinishedPulling="2025-10-03 18:38:20.291372069 +0000 UTC m=+1442.007312941" observedRunningTime="2025-10-03 18:38:20.959863986 +0000 UTC m=+1442.675804858" watchObservedRunningTime="2025-10-03 18:38:20.96852196 +0000 UTC m=+1442.684462832" Oct 03 18:38:26 crc kubenswrapper[4835]: I1003 18:38:26.132406 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:26 crc kubenswrapper[4835]: I1003 18:38:26.132906 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:26 crc kubenswrapper[4835]: I1003 18:38:26.178492 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:27 crc kubenswrapper[4835]: I1003 18:38:27.037889 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:27 crc kubenswrapper[4835]: I1003 18:38:27.083191 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8fmm8"] Oct 03 18:38:29 crc kubenswrapper[4835]: I1003 18:38:29.014445 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8fmm8" podUID="67e0358c-f426-4b0c-8758-f8ef527f36d0" containerName="registry-server" containerID="cri-o://bfc47caa7a875465dbc137c7c9b71b94f854e6d8e4a04ed21561b67575fb16a5" gracePeriod=2 Oct 03 18:38:29 crc kubenswrapper[4835]: I1003 18:38:29.454186 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:29 crc kubenswrapper[4835]: I1003 18:38:29.570948 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67e0358c-f426-4b0c-8758-f8ef527f36d0-utilities\") pod \"67e0358c-f426-4b0c-8758-f8ef527f36d0\" (UID: \"67e0358c-f426-4b0c-8758-f8ef527f36d0\") " Oct 03 18:38:29 crc kubenswrapper[4835]: I1003 18:38:29.570997 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkzwz\" (UniqueName: \"kubernetes.io/projected/67e0358c-f426-4b0c-8758-f8ef527f36d0-kube-api-access-tkzwz\") pod \"67e0358c-f426-4b0c-8758-f8ef527f36d0\" (UID: \"67e0358c-f426-4b0c-8758-f8ef527f36d0\") " Oct 03 18:38:29 crc kubenswrapper[4835]: I1003 18:38:29.571038 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67e0358c-f426-4b0c-8758-f8ef527f36d0-catalog-content\") pod \"67e0358c-f426-4b0c-8758-f8ef527f36d0\" (UID: \"67e0358c-f426-4b0c-8758-f8ef527f36d0\") " Oct 03 18:38:29 crc kubenswrapper[4835]: I1003 18:38:29.572510 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67e0358c-f426-4b0c-8758-f8ef527f36d0-utilities" (OuterVolumeSpecName: "utilities") pod "67e0358c-f426-4b0c-8758-f8ef527f36d0" (UID: "67e0358c-f426-4b0c-8758-f8ef527f36d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:38:29 crc kubenswrapper[4835]: I1003 18:38:29.580469 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e0358c-f426-4b0c-8758-f8ef527f36d0-kube-api-access-tkzwz" (OuterVolumeSpecName: "kube-api-access-tkzwz") pod "67e0358c-f426-4b0c-8758-f8ef527f36d0" (UID: "67e0358c-f426-4b0c-8758-f8ef527f36d0"). InnerVolumeSpecName "kube-api-access-tkzwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:38:29 crc kubenswrapper[4835]: I1003 18:38:29.629597 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67e0358c-f426-4b0c-8758-f8ef527f36d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67e0358c-f426-4b0c-8758-f8ef527f36d0" (UID: "67e0358c-f426-4b0c-8758-f8ef527f36d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:38:29 crc kubenswrapper[4835]: I1003 18:38:29.673471 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67e0358c-f426-4b0c-8758-f8ef527f36d0-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:38:29 crc kubenswrapper[4835]: I1003 18:38:29.673510 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkzwz\" (UniqueName: \"kubernetes.io/projected/67e0358c-f426-4b0c-8758-f8ef527f36d0-kube-api-access-tkzwz\") on node \"crc\" DevicePath \"\"" Oct 03 18:38:29 crc kubenswrapper[4835]: I1003 18:38:29.673523 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67e0358c-f426-4b0c-8758-f8ef527f36d0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:38:30 crc kubenswrapper[4835]: I1003 18:38:30.024680 4835 generic.go:334] "Generic (PLEG): container finished" podID="67e0358c-f426-4b0c-8758-f8ef527f36d0" containerID="bfc47caa7a875465dbc137c7c9b71b94f854e6d8e4a04ed21561b67575fb16a5" exitCode=0 Oct 03 18:38:30 crc kubenswrapper[4835]: I1003 18:38:30.024728 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fmm8" event={"ID":"67e0358c-f426-4b0c-8758-f8ef527f36d0","Type":"ContainerDied","Data":"bfc47caa7a875465dbc137c7c9b71b94f854e6d8e4a04ed21561b67575fb16a5"} Oct 03 18:38:30 crc kubenswrapper[4835]: I1003 18:38:30.024756 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fmm8" event={"ID":"67e0358c-f426-4b0c-8758-f8ef527f36d0","Type":"ContainerDied","Data":"f4c9f308a66c3f245c04501f3eb2e6a35b35a31b21ae03c795b655e4ac728039"} Oct 03 18:38:30 crc kubenswrapper[4835]: I1003 18:38:30.024777 4835 scope.go:117] "RemoveContainer" containerID="bfc47caa7a875465dbc137c7c9b71b94f854e6d8e4a04ed21561b67575fb16a5" Oct 03 18:38:30 crc kubenswrapper[4835]: I1003 18:38:30.024911 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8fmm8" Oct 03 18:38:30 crc kubenswrapper[4835]: I1003 18:38:30.048661 4835 scope.go:117] "RemoveContainer" containerID="828cab164332b03da3e7211ddb8e5fd1cd3dadc1f211b38b96fb6ecb2e6fb449" Oct 03 18:38:30 crc kubenswrapper[4835]: I1003 18:38:30.064174 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8fmm8"] Oct 03 18:38:30 crc kubenswrapper[4835]: I1003 18:38:30.076219 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8fmm8"] Oct 03 18:38:30 crc kubenswrapper[4835]: I1003 18:38:30.088327 4835 scope.go:117] "RemoveContainer" containerID="b06e751dd2fc4f58d293683e95f8b242158b261914ff68d31a85e8e43c84b0b4" Oct 03 18:38:30 crc kubenswrapper[4835]: I1003 18:38:30.120677 4835 scope.go:117] "RemoveContainer" containerID="bfc47caa7a875465dbc137c7c9b71b94f854e6d8e4a04ed21561b67575fb16a5" Oct 03 18:38:30 crc kubenswrapper[4835]: E1003 18:38:30.121212 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfc47caa7a875465dbc137c7c9b71b94f854e6d8e4a04ed21561b67575fb16a5\": container with ID starting with bfc47caa7a875465dbc137c7c9b71b94f854e6d8e4a04ed21561b67575fb16a5 not found: ID does not exist" containerID="bfc47caa7a875465dbc137c7c9b71b94f854e6d8e4a04ed21561b67575fb16a5" Oct 03 18:38:30 crc kubenswrapper[4835]: I1003 18:38:30.121352 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfc47caa7a875465dbc137c7c9b71b94f854e6d8e4a04ed21561b67575fb16a5"} err="failed to get container status \"bfc47caa7a875465dbc137c7c9b71b94f854e6d8e4a04ed21561b67575fb16a5\": rpc error: code = NotFound desc = could not find container \"bfc47caa7a875465dbc137c7c9b71b94f854e6d8e4a04ed21561b67575fb16a5\": container with ID starting with bfc47caa7a875465dbc137c7c9b71b94f854e6d8e4a04ed21561b67575fb16a5 not found: ID does not exist" Oct 03 18:38:30 crc kubenswrapper[4835]: I1003 18:38:30.121457 4835 scope.go:117] "RemoveContainer" containerID="828cab164332b03da3e7211ddb8e5fd1cd3dadc1f211b38b96fb6ecb2e6fb449" Oct 03 18:38:30 crc kubenswrapper[4835]: E1003 18:38:30.121957 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"828cab164332b03da3e7211ddb8e5fd1cd3dadc1f211b38b96fb6ecb2e6fb449\": container with ID starting with 828cab164332b03da3e7211ddb8e5fd1cd3dadc1f211b38b96fb6ecb2e6fb449 not found: ID does not exist" containerID="828cab164332b03da3e7211ddb8e5fd1cd3dadc1f211b38b96fb6ecb2e6fb449" Oct 03 18:38:30 crc kubenswrapper[4835]: I1003 18:38:30.122058 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"828cab164332b03da3e7211ddb8e5fd1cd3dadc1f211b38b96fb6ecb2e6fb449"} err="failed to get container status \"828cab164332b03da3e7211ddb8e5fd1cd3dadc1f211b38b96fb6ecb2e6fb449\": rpc error: code = NotFound desc = could not find container \"828cab164332b03da3e7211ddb8e5fd1cd3dadc1f211b38b96fb6ecb2e6fb449\": container with ID starting with 828cab164332b03da3e7211ddb8e5fd1cd3dadc1f211b38b96fb6ecb2e6fb449 not found: ID does not exist" Oct 03 18:38:30 crc kubenswrapper[4835]: I1003 18:38:30.122194 4835 scope.go:117] "RemoveContainer" containerID="b06e751dd2fc4f58d293683e95f8b242158b261914ff68d31a85e8e43c84b0b4" Oct 03 18:38:30 crc kubenswrapper[4835]: E1003 18:38:30.122480 4835 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b06e751dd2fc4f58d293683e95f8b242158b261914ff68d31a85e8e43c84b0b4\": container with ID starting with b06e751dd2fc4f58d293683e95f8b242158b261914ff68d31a85e8e43c84b0b4 not found: ID does not exist" containerID="b06e751dd2fc4f58d293683e95f8b242158b261914ff68d31a85e8e43c84b0b4" Oct 03 18:38:30 crc kubenswrapper[4835]: I1003 18:38:30.122579 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06e751dd2fc4f58d293683e95f8b242158b261914ff68d31a85e8e43c84b0b4"} err="failed to get container status \"b06e751dd2fc4f58d293683e95f8b242158b261914ff68d31a85e8e43c84b0b4\": rpc error: code = NotFound desc = could not find container \"b06e751dd2fc4f58d293683e95f8b242158b261914ff68d31a85e8e43c84b0b4\": container with ID starting with b06e751dd2fc4f58d293683e95f8b242158b261914ff68d31a85e8e43c84b0b4 not found: ID does not exist" Oct 03 18:38:30 crc kubenswrapper[4835]: I1003 18:38:30.888352 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e0358c-f426-4b0c-8758-f8ef527f36d0" path="/var/lib/kubelet/pods/67e0358c-f426-4b0c-8758-f8ef527f36d0/volumes" Oct 03 18:38:31 crc kubenswrapper[4835]: I1003 18:38:31.992253 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8pjjc"] Oct 03 18:38:31 crc kubenswrapper[4835]: E1003 18:38:31.992852 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e0358c-f426-4b0c-8758-f8ef527f36d0" containerName="extract-content" Oct 03 18:38:31 crc kubenswrapper[4835]: I1003 18:38:31.992868 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e0358c-f426-4b0c-8758-f8ef527f36d0" containerName="extract-content" Oct 03 18:38:31 crc kubenswrapper[4835]: E1003 18:38:31.992903 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e0358c-f426-4b0c-8758-f8ef527f36d0" containerName="registry-server" Oct 03 18:38:31 crc kubenswrapper[4835]: I1003 18:38:31.992912 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e0358c-f426-4b0c-8758-f8ef527f36d0" containerName="registry-server" Oct 03 18:38:31 crc kubenswrapper[4835]: E1003 18:38:31.992942 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e0358c-f426-4b0c-8758-f8ef527f36d0" containerName="extract-utilities" Oct 03 18:38:31 crc kubenswrapper[4835]: I1003 18:38:31.992950 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e0358c-f426-4b0c-8758-f8ef527f36d0" containerName="extract-utilities" Oct 03 18:38:31 crc kubenswrapper[4835]: I1003 18:38:31.993237 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e0358c-f426-4b0c-8758-f8ef527f36d0" containerName="registry-server" Oct 03 18:38:31 crc kubenswrapper[4835]: I1003 18:38:31.995184 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:32 crc kubenswrapper[4835]: I1003 18:38:32.005453 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8pjjc"] Oct 03 18:38:32 crc kubenswrapper[4835]: I1003 18:38:32.123390 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-catalog-content\") pod \"certified-operators-8pjjc\" (UID: \"2e15124f-a26d-4eac-a79f-7e54f31ef0c2\") " pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:32 crc kubenswrapper[4835]: I1003 18:38:32.123471 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2r4z\" (UniqueName: \"kubernetes.io/projected/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-kube-api-access-q2r4z\") pod \"certified-operators-8pjjc\" (UID: \"2e15124f-a26d-4eac-a79f-7e54f31ef0c2\") " pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:32 crc kubenswrapper[4835]: I1003 18:38:32.123517 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-utilities\") pod \"certified-operators-8pjjc\" (UID: \"2e15124f-a26d-4eac-a79f-7e54f31ef0c2\") " pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:32 crc kubenswrapper[4835]: I1003 18:38:32.225819 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2r4z\" (UniqueName: \"kubernetes.io/projected/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-kube-api-access-q2r4z\") pod \"certified-operators-8pjjc\" (UID: \"2e15124f-a26d-4eac-a79f-7e54f31ef0c2\") " pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:32 crc kubenswrapper[4835]: I1003 18:38:32.226081 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-utilities\") pod \"certified-operators-8pjjc\" (UID: \"2e15124f-a26d-4eac-a79f-7e54f31ef0c2\") " pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:32 crc kubenswrapper[4835]: I1003 18:38:32.226339 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-catalog-content\") pod \"certified-operators-8pjjc\" (UID: \"2e15124f-a26d-4eac-a79f-7e54f31ef0c2\") " pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:32 crc kubenswrapper[4835]: I1003 18:38:32.226571 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-utilities\") pod \"certified-operators-8pjjc\" (UID: \"2e15124f-a26d-4eac-a79f-7e54f31ef0c2\") " pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:32 crc kubenswrapper[4835]: I1003 18:38:32.226923 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-catalog-content\") pod \"certified-operators-8pjjc\" (UID: \"2e15124f-a26d-4eac-a79f-7e54f31ef0c2\") " pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:32 crc kubenswrapper[4835]: I1003 18:38:32.250088 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-q2r4z\" (UniqueName: \"kubernetes.io/projected/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-kube-api-access-q2r4z\") pod \"certified-operators-8pjjc\" (UID: \"2e15124f-a26d-4eac-a79f-7e54f31ef0c2\") " pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:32 crc kubenswrapper[4835]: I1003 18:38:32.324101 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:32 crc kubenswrapper[4835]: I1003 18:38:32.846088 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8pjjc"] Oct 03 18:38:33 crc kubenswrapper[4835]: I1003 18:38:33.054995 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pjjc" event={"ID":"2e15124f-a26d-4eac-a79f-7e54f31ef0c2","Type":"ContainerStarted","Data":"c936975bcf2d5459ba0444a9c46e6765849c9525024b17290271e919f9193416"} Oct 03 18:38:34 crc kubenswrapper[4835]: I1003 18:38:34.065171 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e15124f-a26d-4eac-a79f-7e54f31ef0c2" containerID="da8ad492ebad1d0e203802aac793bbdc82e5d63c6e56e0600f0dab1883cc4f2f" exitCode=0 Oct 03 18:38:34 crc kubenswrapper[4835]: I1003 18:38:34.065459 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pjjc" event={"ID":"2e15124f-a26d-4eac-a79f-7e54f31ef0c2","Type":"ContainerDied","Data":"da8ad492ebad1d0e203802aac793bbdc82e5d63c6e56e0600f0dab1883cc4f2f"} Oct 03 18:38:35 crc kubenswrapper[4835]: I1003 18:38:35.078952 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pjjc" event={"ID":"2e15124f-a26d-4eac-a79f-7e54f31ef0c2","Type":"ContainerStarted","Data":"59408663b67b657311712ee8df059a93b279d9cbd5b30478815367d0bb210462"} Oct 03 18:38:36 crc kubenswrapper[4835]: I1003 18:38:36.090428 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e15124f-a26d-4eac-a79f-7e54f31ef0c2" containerID="59408663b67b657311712ee8df059a93b279d9cbd5b30478815367d0bb210462" exitCode=0 Oct 03 18:38:36 crc kubenswrapper[4835]: I1003 18:38:36.090546 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pjjc" event={"ID":"2e15124f-a26d-4eac-a79f-7e54f31ef0c2","Type":"ContainerDied","Data":"59408663b67b657311712ee8df059a93b279d9cbd5b30478815367d0bb210462"} Oct 03 18:38:37 crc kubenswrapper[4835]: I1003 18:38:37.100940 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pjjc" event={"ID":"2e15124f-a26d-4eac-a79f-7e54f31ef0c2","Type":"ContainerStarted","Data":"9c54261c3bd8469e65f3382a37afa0fe696a150674f7f4d4c37c70fe7e321783"} Oct 03 18:38:37 crc kubenswrapper[4835]: I1003 18:38:37.126509 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8pjjc" podStartSLOduration=3.667099345 podStartE2EDuration="6.126491897s" podCreationTimestamp="2025-10-03 18:38:31 +0000 UTC" firstStartedPulling="2025-10-03 18:38:34.068175685 +0000 UTC m=+1455.784116557" lastFinishedPulling="2025-10-03 18:38:36.527568237 +0000 UTC m=+1458.243509109" observedRunningTime="2025-10-03 18:38:37.117883372 +0000 UTC m=+1458.833824244" watchObservedRunningTime="2025-10-03 18:38:37.126491897 +0000 UTC m=+1458.842432769" Oct 03 18:38:40 crc kubenswrapper[4835]: I1003 18:38:40.870029 4835 scope.go:117] "RemoveContainer" 
containerID="96b91c11670e6c312658d7ec0de4f3048c8396ee891f3797f4b740ebe3682a56" Oct 03 18:38:42 crc kubenswrapper[4835]: I1003 18:38:42.325153 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:42 crc kubenswrapper[4835]: I1003 18:38:42.325725 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:42 crc kubenswrapper[4835]: I1003 18:38:42.377584 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:43 crc kubenswrapper[4835]: I1003 18:38:43.200335 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:43 crc kubenswrapper[4835]: I1003 18:38:43.243361 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8pjjc"] Oct 03 18:38:45 crc kubenswrapper[4835]: I1003 18:38:45.176230 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8pjjc" podUID="2e15124f-a26d-4eac-a79f-7e54f31ef0c2" containerName="registry-server" containerID="cri-o://9c54261c3bd8469e65f3382a37afa0fe696a150674f7f4d4c37c70fe7e321783" gracePeriod=2 Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.137991 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.186367 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e15124f-a26d-4eac-a79f-7e54f31ef0c2" containerID="9c54261c3bd8469e65f3382a37afa0fe696a150674f7f4d4c37c70fe7e321783" exitCode=0 Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.186409 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pjjc" event={"ID":"2e15124f-a26d-4eac-a79f-7e54f31ef0c2","Type":"ContainerDied","Data":"9c54261c3bd8469e65f3382a37afa0fe696a150674f7f4d4c37c70fe7e321783"} Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.186446 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pjjc" event={"ID":"2e15124f-a26d-4eac-a79f-7e54f31ef0c2","Type":"ContainerDied","Data":"c936975bcf2d5459ba0444a9c46e6765849c9525024b17290271e919f9193416"} Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.186456 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8pjjc" Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.186468 4835 scope.go:117] "RemoveContainer" containerID="9c54261c3bd8469e65f3382a37afa0fe696a150674f7f4d4c37c70fe7e321783" Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.191015 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2r4z\" (UniqueName: \"kubernetes.io/projected/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-kube-api-access-q2r4z\") pod \"2e15124f-a26d-4eac-a79f-7e54f31ef0c2\" (UID: \"2e15124f-a26d-4eac-a79f-7e54f31ef0c2\") " Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.191090 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-catalog-content\") pod \"2e15124f-a26d-4eac-a79f-7e54f31ef0c2\" (UID: \"2e15124f-a26d-4eac-a79f-7e54f31ef0c2\") " Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.191161 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-utilities\") pod \"2e15124f-a26d-4eac-a79f-7e54f31ef0c2\" (UID: \"2e15124f-a26d-4eac-a79f-7e54f31ef0c2\") " Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.192544 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-utilities" (OuterVolumeSpecName: "utilities") pod "2e15124f-a26d-4eac-a79f-7e54f31ef0c2" (UID: "2e15124f-a26d-4eac-a79f-7e54f31ef0c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.197285 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-kube-api-access-q2r4z" (OuterVolumeSpecName: "kube-api-access-q2r4z") pod "2e15124f-a26d-4eac-a79f-7e54f31ef0c2" (UID: "2e15124f-a26d-4eac-a79f-7e54f31ef0c2"). InnerVolumeSpecName "kube-api-access-q2r4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.208912 4835 scope.go:117] "RemoveContainer" containerID="59408663b67b657311712ee8df059a93b279d9cbd5b30478815367d0bb210462" Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.235360 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e15124f-a26d-4eac-a79f-7e54f31ef0c2" (UID: "2e15124f-a26d-4eac-a79f-7e54f31ef0c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.255564 4835 scope.go:117] "RemoveContainer" containerID="da8ad492ebad1d0e203802aac793bbdc82e5d63c6e56e0600f0dab1883cc4f2f" Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.292559 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2r4z\" (UniqueName: \"kubernetes.io/projected/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-kube-api-access-q2r4z\") on node \"crc\" DevicePath \"\"" Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.292599 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.292610 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e15124f-a26d-4eac-a79f-7e54f31ef0c2-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.300842 4835 scope.go:117] "RemoveContainer" containerID="9c54261c3bd8469e65f3382a37afa0fe696a150674f7f4d4c37c70fe7e321783" Oct 03 18:38:46 crc kubenswrapper[4835]: E1003 18:38:46.301403 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c54261c3bd8469e65f3382a37afa0fe696a150674f7f4d4c37c70fe7e321783\": container with ID starting with 9c54261c3bd8469e65f3382a37afa0fe696a150674f7f4d4c37c70fe7e321783 not found: ID does not exist" containerID="9c54261c3bd8469e65f3382a37afa0fe696a150674f7f4d4c37c70fe7e321783" Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.301439 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c54261c3bd8469e65f3382a37afa0fe696a150674f7f4d4c37c70fe7e321783"} err="failed to get container status \"9c54261c3bd8469e65f3382a37afa0fe696a150674f7f4d4c37c70fe7e321783\": rpc error: code = NotFound desc = could not find container \"9c54261c3bd8469e65f3382a37afa0fe696a150674f7f4d4c37c70fe7e321783\": container with ID starting with 9c54261c3bd8469e65f3382a37afa0fe696a150674f7f4d4c37c70fe7e321783 not found: ID does not exist" Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.301461 4835 scope.go:117] "RemoveContainer" containerID="59408663b67b657311712ee8df059a93b279d9cbd5b30478815367d0bb210462" Oct 03 18:38:46 crc kubenswrapper[4835]: E1003 18:38:46.301842 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59408663b67b657311712ee8df059a93b279d9cbd5b30478815367d0bb210462\": container with ID starting with 59408663b67b657311712ee8df059a93b279d9cbd5b30478815367d0bb210462 not found: ID does not exist" containerID="59408663b67b657311712ee8df059a93b279d9cbd5b30478815367d0bb210462" Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.301909 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59408663b67b657311712ee8df059a93b279d9cbd5b30478815367d0bb210462"} err="failed to get container status \"59408663b67b657311712ee8df059a93b279d9cbd5b30478815367d0bb210462\": rpc error: code = NotFound desc = could not find container \"59408663b67b657311712ee8df059a93b279d9cbd5b30478815367d0bb210462\": container with ID starting with 59408663b67b657311712ee8df059a93b279d9cbd5b30478815367d0bb210462 not found: ID does not exist" Oct 03 18:38:46 crc 
kubenswrapper[4835]: I1003 18:38:46.301960 4835 scope.go:117] "RemoveContainer" containerID="da8ad492ebad1d0e203802aac793bbdc82e5d63c6e56e0600f0dab1883cc4f2f" Oct 03 18:38:46 crc kubenswrapper[4835]: E1003 18:38:46.302410 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da8ad492ebad1d0e203802aac793bbdc82e5d63c6e56e0600f0dab1883cc4f2f\": container with ID starting with da8ad492ebad1d0e203802aac793bbdc82e5d63c6e56e0600f0dab1883cc4f2f not found: ID does not exist" containerID="da8ad492ebad1d0e203802aac793bbdc82e5d63c6e56e0600f0dab1883cc4f2f" Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.302444 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da8ad492ebad1d0e203802aac793bbdc82e5d63c6e56e0600f0dab1883cc4f2f"} err="failed to get container status \"da8ad492ebad1d0e203802aac793bbdc82e5d63c6e56e0600f0dab1883cc4f2f\": rpc error: code = NotFound desc = could not find container \"da8ad492ebad1d0e203802aac793bbdc82e5d63c6e56e0600f0dab1883cc4f2f\": container with ID starting with da8ad492ebad1d0e203802aac793bbdc82e5d63c6e56e0600f0dab1883cc4f2f not found: ID does not exist" Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.524448 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8pjjc"] Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.533356 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8pjjc"] Oct 03 18:38:46 crc kubenswrapper[4835]: I1003 18:38:46.894173 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e15124f-a26d-4eac-a79f-7e54f31ef0c2" path="/var/lib/kubelet/pods/2e15124f-a26d-4eac-a79f-7e54f31ef0c2/volumes" Oct 03 18:39:05 crc kubenswrapper[4835]: I1003 18:39:05.358145 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:39:05 crc kubenswrapper[4835]: I1003 18:39:05.359758 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:39:35 crc kubenswrapper[4835]: I1003 18:39:35.358594 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:39:35 crc kubenswrapper[4835]: I1003 18:39:35.359190 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:40:05 crc kubenswrapper[4835]: I1003 18:40:05.358560 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:40:05 crc kubenswrapper[4835]: I1003 18:40:05.359182 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:40:05 crc kubenswrapper[4835]: I1003 18:40:05.359235 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:40:05 crc kubenswrapper[4835]: I1003 18:40:05.360085 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 18:40:05 crc kubenswrapper[4835]: I1003 18:40:05.360158 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" gracePeriod=600 Oct 03 18:40:05 crc kubenswrapper[4835]: E1003 18:40:05.484375 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:40:05 crc kubenswrapper[4835]: I1003 18:40:05.970507 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" exitCode=0 Oct 03 18:40:05 crc kubenswrapper[4835]: I1003 18:40:05.970560 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce"} Oct 03 18:40:05 crc kubenswrapper[4835]: I1003 18:40:05.970869 4835 scope.go:117] "RemoveContainer" containerID="f6256e41e920d222ccf54930d399b21cd032b6d7ace88624e5e3fa3510d642ea" Oct 03 18:40:05 crc kubenswrapper[4835]: I1003 18:40:05.971554 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:40:05 crc kubenswrapper[4835]: E1003 18:40:05.971888 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:40:17 crc kubenswrapper[4835]: I1003 18:40:17.877482 4835 
scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:40:17 crc kubenswrapper[4835]: E1003 18:40:17.878231 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:40:29 crc kubenswrapper[4835]: I1003 18:40:29.878462 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:40:29 crc kubenswrapper[4835]: E1003 18:40:29.879403 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:40:44 crc kubenswrapper[4835]: I1003 18:40:44.877303 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:40:44 crc kubenswrapper[4835]: E1003 18:40:44.878156 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:40:52 crc kubenswrapper[4835]: I1003 18:40:52.466248 4835 generic.go:334] "Generic (PLEG): container finished" podID="99ce37b7-29b9-44ed-a066-bc503ae35b61" containerID="978f1be6ffa7a43cd1652de497582733a750323e90ccefd50633bf74c32d7471" exitCode=0 Oct 03 18:40:52 crc kubenswrapper[4835]: I1003 18:40:52.466339 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" event={"ID":"99ce37b7-29b9-44ed-a066-bc503ae35b61","Type":"ContainerDied","Data":"978f1be6ffa7a43cd1652de497582733a750323e90ccefd50633bf74c32d7471"} Oct 03 18:40:53 crc kubenswrapper[4835]: I1003 18:40:53.948268 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.068342 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzrml\" (UniqueName: \"kubernetes.io/projected/99ce37b7-29b9-44ed-a066-bc503ae35b61-kube-api-access-pzrml\") pod \"99ce37b7-29b9-44ed-a066-bc503ae35b61\" (UID: \"99ce37b7-29b9-44ed-a066-bc503ae35b61\") " Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.069107 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-ssh-key\") pod \"99ce37b7-29b9-44ed-a066-bc503ae35b61\" (UID: \"99ce37b7-29b9-44ed-a066-bc503ae35b61\") " Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.069257 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-bootstrap-combined-ca-bundle\") pod \"99ce37b7-29b9-44ed-a066-bc503ae35b61\" (UID: \"99ce37b7-29b9-44ed-a066-bc503ae35b61\") " Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.069457 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-inventory\") pod \"99ce37b7-29b9-44ed-a066-bc503ae35b61\" (UID: \"99ce37b7-29b9-44ed-a066-bc503ae35b61\") " Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.074048 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ce37b7-29b9-44ed-a066-bc503ae35b61-kube-api-access-pzrml" (OuterVolumeSpecName: "kube-api-access-pzrml") pod "99ce37b7-29b9-44ed-a066-bc503ae35b61" (UID: "99ce37b7-29b9-44ed-a066-bc503ae35b61"). InnerVolumeSpecName "kube-api-access-pzrml". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.093562 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "99ce37b7-29b9-44ed-a066-bc503ae35b61" (UID: "99ce37b7-29b9-44ed-a066-bc503ae35b61"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.099702 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "99ce37b7-29b9-44ed-a066-bc503ae35b61" (UID: "99ce37b7-29b9-44ed-a066-bc503ae35b61"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.102312 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-inventory" (OuterVolumeSpecName: "inventory") pod "99ce37b7-29b9-44ed-a066-bc503ae35b61" (UID: "99ce37b7-29b9-44ed-a066-bc503ae35b61"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.173154 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.173200 4835 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.173214 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99ce37b7-29b9-44ed-a066-bc503ae35b61-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.173223 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzrml\" (UniqueName: \"kubernetes.io/projected/99ce37b7-29b9-44ed-a066-bc503ae35b61-kube-api-access-pzrml\") on node \"crc\" DevicePath \"\"" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.488610 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" event={"ID":"99ce37b7-29b9-44ed-a066-bc503ae35b61","Type":"ContainerDied","Data":"e8f0077a8d82c6b45b764688b5f246207206f14a1f200c42599b5b1b7ca585d5"} Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.488647 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8f0077a8d82c6b45b764688b5f246207206f14a1f200c42599b5b1b7ca585d5" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.488699 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.590852 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm"] Oct 03 18:40:54 crc kubenswrapper[4835]: E1003 18:40:54.591261 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e15124f-a26d-4eac-a79f-7e54f31ef0c2" containerName="extract-utilities" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.591281 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e15124f-a26d-4eac-a79f-7e54f31ef0c2" containerName="extract-utilities" Oct 03 18:40:54 crc kubenswrapper[4835]: E1003 18:40:54.591297 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e15124f-a26d-4eac-a79f-7e54f31ef0c2" containerName="registry-server" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.591304 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e15124f-a26d-4eac-a79f-7e54f31ef0c2" containerName="registry-server" Oct 03 18:40:54 crc kubenswrapper[4835]: E1003 18:40:54.591336 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e15124f-a26d-4eac-a79f-7e54f31ef0c2" containerName="extract-content" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.591342 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e15124f-a26d-4eac-a79f-7e54f31ef0c2" containerName="extract-content" Oct 03 18:40:54 crc kubenswrapper[4835]: E1003 18:40:54.591352 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ce37b7-29b9-44ed-a066-bc503ae35b61" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.591362 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ce37b7-29b9-44ed-a066-bc503ae35b61" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.591554 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e15124f-a26d-4eac-a79f-7e54f31ef0c2" containerName="registry-server" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.591586 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ce37b7-29b9-44ed-a066-bc503ae35b61" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.592336 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.594362 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.596537 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.596713 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.596889 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.603716 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm"] Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.784395 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/040bbc22-68da-4384-981c-4b7716352d49-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm\" (UID: \"040bbc22-68da-4384-981c-4b7716352d49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.784491 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mg6q\" (UniqueName: \"kubernetes.io/projected/040bbc22-68da-4384-981c-4b7716352d49-kube-api-access-2mg6q\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm\" (UID: \"040bbc22-68da-4384-981c-4b7716352d49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.784528 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/040bbc22-68da-4384-981c-4b7716352d49-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm\" (UID: \"040bbc22-68da-4384-981c-4b7716352d49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.886048 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mg6q\" (UniqueName: \"kubernetes.io/projected/040bbc22-68da-4384-981c-4b7716352d49-kube-api-access-2mg6q\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm\" (UID: \"040bbc22-68da-4384-981c-4b7716352d49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.886116 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/040bbc22-68da-4384-981c-4b7716352d49-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm\" (UID: \"040bbc22-68da-4384-981c-4b7716352d49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.886224 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/040bbc22-68da-4384-981c-4b7716352d49-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm\" (UID: \"040bbc22-68da-4384-981c-4b7716352d49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.889863 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/040bbc22-68da-4384-981c-4b7716352d49-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm\" (UID: \"040bbc22-68da-4384-981c-4b7716352d49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.890098 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/040bbc22-68da-4384-981c-4b7716352d49-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm\" (UID: \"040bbc22-68da-4384-981c-4b7716352d49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.903455 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mg6q\" (UniqueName: \"kubernetes.io/projected/040bbc22-68da-4384-981c-4b7716352d49-kube-api-access-2mg6q\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm\" (UID: \"040bbc22-68da-4384-981c-4b7716352d49\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" Oct 03 18:40:54 crc kubenswrapper[4835]: I1003 18:40:54.909357 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" Oct 03 18:40:55 crc kubenswrapper[4835]: I1003 18:40:55.412256 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm"] Oct 03 18:40:55 crc kubenswrapper[4835]: I1003 18:40:55.420749 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 18:40:55 crc kubenswrapper[4835]: I1003 18:40:55.505823 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" event={"ID":"040bbc22-68da-4384-981c-4b7716352d49","Type":"ContainerStarted","Data":"85f642186d22a0508ae35565d5a586577006b7becb26a0b9cc629df30c2cea4d"} Oct 03 18:40:56 crc kubenswrapper[4835]: I1003 18:40:56.520764 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" event={"ID":"040bbc22-68da-4384-981c-4b7716352d49","Type":"ContainerStarted","Data":"17025b9a9cc00e4293458a5fde03f93739d6fc513fcd89f64c7969466555c89a"} Oct 03 18:40:56 crc kubenswrapper[4835]: I1003 18:40:56.547408 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" podStartSLOduration=2.132394648 podStartE2EDuration="2.547385331s" podCreationTimestamp="2025-10-03 18:40:54 +0000 UTC" firstStartedPulling="2025-10-03 18:40:55.420453872 +0000 UTC m=+1597.136394744" lastFinishedPulling="2025-10-03 18:40:55.835444565 +0000 UTC m=+1597.551385427" observedRunningTime="2025-10-03 18:40:56.541222157 +0000 UTC m=+1598.257163099" watchObservedRunningTime="2025-10-03 18:40:56.547385331 +0000 UTC m=+1598.263326203" Oct 03 18:40:58 crc kubenswrapper[4835]: I1003 18:40:58.885952 4835 scope.go:117] "RemoveContainer" 
containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:40:58 crc kubenswrapper[4835]: E1003 18:40:58.887867 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:41:06 crc kubenswrapper[4835]: I1003 18:41:06.049250 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-2qv8l"] Oct 03 18:41:06 crc kubenswrapper[4835]: I1003 18:41:06.061466 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-2qv8l"] Oct 03 18:41:06 crc kubenswrapper[4835]: I1003 18:41:06.070684 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-n5gjl"] Oct 03 18:41:06 crc kubenswrapper[4835]: I1003 18:41:06.084172 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-676cv"] Oct 03 18:41:06 crc kubenswrapper[4835]: I1003 18:41:06.097635 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-676cv"] Oct 03 18:41:06 crc kubenswrapper[4835]: I1003 18:41:06.110250 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-jfv8f"] Oct 03 18:41:06 crc kubenswrapper[4835]: I1003 18:41:06.121515 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-n5gjl"] Oct 03 18:41:06 crc kubenswrapper[4835]: I1003 18:41:06.132021 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-jfv8f"] Oct 03 18:41:06 crc kubenswrapper[4835]: I1003 18:41:06.894264 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c3b62d6-8a47-48b7-8bb7-ffc031a67eef" path="/var/lib/kubelet/pods/7c3b62d6-8a47-48b7-8bb7-ffc031a67eef/volumes" Oct 03 18:41:06 crc kubenswrapper[4835]: I1003 18:41:06.895950 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8906174e-7e3e-4c75-916d-b9dc0428ed75" path="/var/lib/kubelet/pods/8906174e-7e3e-4c75-916d-b9dc0428ed75/volumes" Oct 03 18:41:06 crc kubenswrapper[4835]: I1003 18:41:06.897212 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="991a2cf0-ae6d-4c94-8bc2-f0f605077c13" path="/var/lib/kubelet/pods/991a2cf0-ae6d-4c94-8bc2-f0f605077c13/volumes" Oct 03 18:41:06 crc kubenswrapper[4835]: I1003 18:41:06.897884 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef013310-2986-4594-a8a2-f9d0f9f686df" path="/var/lib/kubelet/pods/ef013310-2986-4594-a8a2-f9d0f9f686df/volumes" Oct 03 18:41:10 crc kubenswrapper[4835]: I1003 18:41:10.877392 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:41:10 crc kubenswrapper[4835]: E1003 18:41:10.877935 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:41:16 crc 
kubenswrapper[4835]: I1003 18:41:16.049175 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-8132-account-create-xwj7q"] Oct 03 18:41:16 crc kubenswrapper[4835]: I1003 18:41:16.060833 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-8132-account-create-xwj7q"] Oct 03 18:41:16 crc kubenswrapper[4835]: I1003 18:41:16.889994 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1f1068f-9964-41d4-909d-19cc6c035a73" path="/var/lib/kubelet/pods/a1f1068f-9964-41d4-909d-19cc6c035a73/volumes" Oct 03 18:41:23 crc kubenswrapper[4835]: I1003 18:41:23.029106 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-41e5-account-create-zlkgq"] Oct 03 18:41:23 crc kubenswrapper[4835]: I1003 18:41:23.039542 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-41e5-account-create-zlkgq"] Oct 03 18:41:24 crc kubenswrapper[4835]: I1003 18:41:24.887580 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5816f3db-fcba-4c2f-872e-0bdf53f7e7df" path="/var/lib/kubelet/pods/5816f3db-fcba-4c2f-872e-0bdf53f7e7df/volumes" Oct 03 18:41:25 crc kubenswrapper[4835]: I1003 18:41:25.024848 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-dd46-account-create-j255n"] Oct 03 18:41:25 crc kubenswrapper[4835]: I1003 18:41:25.034670 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c92d-account-create-2tkrd"] Oct 03 18:41:25 crc kubenswrapper[4835]: I1003 18:41:25.043289 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c92d-account-create-2tkrd"] Oct 03 18:41:25 crc kubenswrapper[4835]: I1003 18:41:25.050764 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-dd46-account-create-j255n"] Oct 03 18:41:25 crc kubenswrapper[4835]: I1003 18:41:25.877473 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:41:25 crc kubenswrapper[4835]: E1003 18:41:25.877896 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:41:26 crc kubenswrapper[4835]: I1003 18:41:26.887566 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31260cd4-00fe-424d-9111-32de1fcae207" path="/var/lib/kubelet/pods/31260cd4-00fe-424d-9111-32de1fcae207/volumes" Oct 03 18:41:26 crc kubenswrapper[4835]: I1003 18:41:26.889435 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b" path="/var/lib/kubelet/pods/9dfe15e4-e9d4-45cb-a7ce-9377e0666f7b/volumes" Oct 03 18:41:36 crc kubenswrapper[4835]: I1003 18:41:36.876646 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:41:36 crc kubenswrapper[4835]: E1003 18:41:36.877460 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:41:41 crc kubenswrapper[4835]: I1003 18:41:41.045384 4835 scope.go:117] "RemoveContainer" containerID="f1a83e0acbe67fc7ab8a5d9446116938d8f5efa57d8c9a97bbc06e20c310f635" Oct 03 18:41:41 crc kubenswrapper[4835]: I1003 18:41:41.067513 4835 scope.go:117] "RemoveContainer" containerID="ea0f25410c43d25ed9f3938f40875dc0392ab904da83c5fd6f8db244733fc46c" Oct 03 18:41:41 crc kubenswrapper[4835]: I1003 18:41:41.086342 4835 scope.go:117] "RemoveContainer" containerID="e187d5c9482b2da051394b1d20d01dc84c62e84d786b0afc593ee9d21083e20c" Oct 03 18:41:41 crc kubenswrapper[4835]: I1003 18:41:41.133695 4835 scope.go:117] "RemoveContainer" containerID="4b3da114b9ee0fa0989b751a5259b35b3d3cb7246905ed123e47ecc35e62d80d" Oct 03 18:41:41 crc kubenswrapper[4835]: I1003 18:41:41.151898 4835 scope.go:117] "RemoveContainer" containerID="449f4e38da63229c20e564420d06245b82718e77ca552ccf9765b7c84cae3e3e" Oct 03 18:41:41 crc kubenswrapper[4835]: I1003 18:41:41.202391 4835 scope.go:117] "RemoveContainer" containerID="7489ec8d94a3bcbb02f511b57c58518fbc1e5a9783369f1dbc69e8bb2c3dc3ba" Oct 03 18:41:41 crc kubenswrapper[4835]: I1003 18:41:41.245826 4835 scope.go:117] "RemoveContainer" containerID="d6a7a5a8a0115496d28d5035ccaa4405bf14455163d8752cea7f1d893f5d5e85" Oct 03 18:41:41 crc kubenswrapper[4835]: I1003 18:41:41.297749 4835 scope.go:117] "RemoveContainer" containerID="f872b59b883c8e4f20f046bbe9244e290359ac55a59d667c9bf1eedde534a9d4" Oct 03 18:41:41 crc kubenswrapper[4835]: I1003 18:41:41.317070 4835 scope.go:117] "RemoveContainer" containerID="eda660eb2c65a655847c41b971837f189a5f1c84018b25b336b7f65aebd0334c" Oct 03 18:41:41 crc kubenswrapper[4835]: I1003 18:41:41.363691 4835 scope.go:117] "RemoveContainer" containerID="5ada613162cff0cebd48cff9da0bb102dfd8a8bc1ef58c7d504870a3bc5e235f" Oct 03 18:41:41 crc kubenswrapper[4835]: I1003 18:41:41.392659 4835 scope.go:117] "RemoveContainer" containerID="2eee8ab267f08fb7635468a90b258465d6557cb2b615418f5f894c8d3cba641e" Oct 03 18:41:41 crc kubenswrapper[4835]: I1003 18:41:41.413399 4835 scope.go:117] "RemoveContainer" containerID="48f1a858f5c03455f6fe8336219583cc3313591c545d9e6af0d4648988eee2e9" Oct 03 18:41:41 crc kubenswrapper[4835]: I1003 18:41:41.430623 4835 scope.go:117] "RemoveContainer" containerID="490642377d84e04f9b33a361c5af3b4470568cbcb85ec0f487211ece75e70252" Oct 03 18:41:41 crc kubenswrapper[4835]: I1003 18:41:41.454881 4835 scope.go:117] "RemoveContainer" containerID="d3f58fdab93f62baa6af045ff1a0c57a51e2e37dc61cdde574d4ff5feb2a0b15" Oct 03 18:41:47 crc kubenswrapper[4835]: I1003 18:41:47.880240 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:41:47 crc kubenswrapper[4835]: E1003 18:41:47.880979 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:41:51 crc kubenswrapper[4835]: I1003 18:41:51.049139 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-db-create-wp92w"] Oct 03 18:41:51 crc kubenswrapper[4835]: I1003 18:41:51.064296 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-dk4ps"] Oct 03 18:41:51 crc kubenswrapper[4835]: I1003 18:41:51.073957 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-dk4ps"] Oct 03 18:41:51 crc kubenswrapper[4835]: I1003 18:41:51.082501 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-wp92w"] Oct 03 18:41:52 crc kubenswrapper[4835]: I1003 18:41:52.887161 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9162685-16b5-45ab-824b-6d5cbd3e3d98" path="/var/lib/kubelet/pods/a9162685-16b5-45ab-824b-6d5cbd3e3d98/volumes" Oct 03 18:41:52 crc kubenswrapper[4835]: I1003 18:41:52.887991 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc7fc2c2-1936-44ed-ba40-1baa10908df0" path="/var/lib/kubelet/pods/bc7fc2c2-1936-44ed-ba40-1baa10908df0/volumes" Oct 03 18:41:54 crc kubenswrapper[4835]: I1003 18:41:54.030060 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-t6nss"] Oct 03 18:41:54 crc kubenswrapper[4835]: I1003 18:41:54.040688 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-t6nss"] Oct 03 18:41:54 crc kubenswrapper[4835]: I1003 18:41:54.887182 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1223b6d4-6430-46e3-8df7-de4952fe563e" path="/var/lib/kubelet/pods/1223b6d4-6430-46e3-8df7-de4952fe563e/volumes" Oct 03 18:42:01 crc kubenswrapper[4835]: I1003 18:42:01.023765 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-c692-account-create-wrwnk"] Oct 03 18:42:01 crc kubenswrapper[4835]: I1003 18:42:01.031766 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-c692-account-create-wrwnk"] Oct 03 18:42:01 crc kubenswrapper[4835]: I1003 18:42:01.877131 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:42:01 crc kubenswrapper[4835]: E1003 18:42:01.877453 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:42:02 crc kubenswrapper[4835]: I1003 18:42:02.025692 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d472-account-create-4n9kl"] Oct 03 18:42:02 crc kubenswrapper[4835]: I1003 18:42:02.034997 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d472-account-create-4n9kl"] Oct 03 18:42:02 crc kubenswrapper[4835]: I1003 18:42:02.888724 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45a00bc2-b857-4530-8823-66a7204feade" path="/var/lib/kubelet/pods/45a00bc2-b857-4530-8823-66a7204feade/volumes" Oct 03 18:42:02 crc kubenswrapper[4835]: I1003 18:42:02.889392 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e42d7b31-acc0-4ed1-87f5-3a2fad499556" path="/var/lib/kubelet/pods/e42d7b31-acc0-4ed1-87f5-3a2fad499556/volumes" Oct 03 18:42:06 crc kubenswrapper[4835]: I1003 18:42:06.031617 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/watcher-db-sync-9bqlb"] Oct 03 18:42:06 crc kubenswrapper[4835]: I1003 18:42:06.039860 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-9bqlb"] Oct 03 18:42:06 crc kubenswrapper[4835]: I1003 18:42:06.887779 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f585da2-5074-43be-8260-e854b0e3b1a6" path="/var/lib/kubelet/pods/2f585da2-5074-43be-8260-e854b0e3b1a6/volumes" Oct 03 18:42:07 crc kubenswrapper[4835]: I1003 18:42:07.029087 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-g9qx7"] Oct 03 18:42:07 crc kubenswrapper[4835]: I1003 18:42:07.038507 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-g9qx7"] Oct 03 18:42:08 crc kubenswrapper[4835]: I1003 18:42:08.890586 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07b0cbfd-5980-4408-927d-6e8b474a09a7" path="/var/lib/kubelet/pods/07b0cbfd-5980-4408-927d-6e8b474a09a7/volumes" Oct 03 18:42:09 crc kubenswrapper[4835]: I1003 18:42:09.046296 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-9jg6f"] Oct 03 18:42:09 crc kubenswrapper[4835]: I1003 18:42:09.060485 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-9jg6f"] Oct 03 18:42:10 crc kubenswrapper[4835]: I1003 18:42:10.888459 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94622de9-5048-41be-875b-dc37acc7eba4" path="/var/lib/kubelet/pods/94622de9-5048-41be-875b-dc37acc7eba4/volumes" Oct 03 18:42:13 crc kubenswrapper[4835]: I1003 18:42:13.028220 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c3bd-account-create-cvkm5"] Oct 03 18:42:13 crc kubenswrapper[4835]: I1003 18:42:13.037054 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c3bd-account-create-cvkm5"] Oct 03 18:42:14 crc kubenswrapper[4835]: I1003 18:42:14.888191 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87bc8f1a-de34-4448-bc20-10e5b92907e6" path="/var/lib/kubelet/pods/87bc8f1a-de34-4448-bc20-10e5b92907e6/volumes" Oct 03 18:42:16 crc kubenswrapper[4835]: I1003 18:42:16.876854 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:42:16 crc kubenswrapper[4835]: E1003 18:42:16.877433 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:42:28 crc kubenswrapper[4835]: I1003 18:42:28.883754 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:42:28 crc kubenswrapper[4835]: E1003 18:42:28.884426 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:42:40 crc kubenswrapper[4835]: I1003 
18:42:40.877134 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:42:40 crc kubenswrapper[4835]: E1003 18:42:40.877814 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:42:41 crc kubenswrapper[4835]: I1003 18:42:41.638000 4835 scope.go:117] "RemoveContainer" containerID="341b0758340c4d6c6c5b16d296fefbaf5bbab98888255cd0a922ee3e3471be20" Oct 03 18:42:41 crc kubenswrapper[4835]: I1003 18:42:41.658552 4835 scope.go:117] "RemoveContainer" containerID="097f5bff0c61523a37b6b642ddae3aa39eb0c0ed802eb3f58b26bd3f4cb64fd2" Oct 03 18:42:41 crc kubenswrapper[4835]: I1003 18:42:41.708714 4835 scope.go:117] "RemoveContainer" containerID="42490ccf585ca2dfb4dabd7c5c3d9cd9e8a3b01cface0dd3b9c29112b1c707c1" Oct 03 18:42:41 crc kubenswrapper[4835]: I1003 18:42:41.760645 4835 scope.go:117] "RemoveContainer" containerID="8cebce2b888e0113dc21d943570529659909ea96b53b980f985eb170ed64f72a" Oct 03 18:42:41 crc kubenswrapper[4835]: I1003 18:42:41.805767 4835 scope.go:117] "RemoveContainer" containerID="7feb2944e524dc8a34f99172d3aae778a9ac99aa266e0a128498863dac0fc279" Oct 03 18:42:41 crc kubenswrapper[4835]: I1003 18:42:41.845046 4835 scope.go:117] "RemoveContainer" containerID="078e39ddc273d8c4eb796cdaccebabc4766b1579c5e63c58c7ffbeeb3bad5f11" Oct 03 18:42:41 crc kubenswrapper[4835]: I1003 18:42:41.889336 4835 scope.go:117] "RemoveContainer" containerID="fa936e2298c5f749805b7e599706afbd4bf89fef32ea343742e3924c933db470" Oct 03 18:42:41 crc kubenswrapper[4835]: I1003 18:42:41.909132 4835 scope.go:117] "RemoveContainer" containerID="faf723616274a65eabc0b792453f25742b5b77ee440a051863d30601fd1f9525" Oct 03 18:42:41 crc kubenswrapper[4835]: I1003 18:42:41.943535 4835 scope.go:117] "RemoveContainer" containerID="f3a8064d4600d03b601b52bd3666668b95f637711d2745fb936ce6168a9a53fd" Oct 03 18:42:42 crc kubenswrapper[4835]: I1003 18:42:42.512590 4835 generic.go:334] "Generic (PLEG): container finished" podID="040bbc22-68da-4384-981c-4b7716352d49" containerID="17025b9a9cc00e4293458a5fde03f93739d6fc513fcd89f64c7969466555c89a" exitCode=0 Oct 03 18:42:42 crc kubenswrapper[4835]: I1003 18:42:42.512788 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" event={"ID":"040bbc22-68da-4384-981c-4b7716352d49","Type":"ContainerDied","Data":"17025b9a9cc00e4293458a5fde03f93739d6fc513fcd89f64c7969466555c89a"} Oct 03 18:42:43 crc kubenswrapper[4835]: I1003 18:42:43.921817 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.063768 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mg6q\" (UniqueName: \"kubernetes.io/projected/040bbc22-68da-4384-981c-4b7716352d49-kube-api-access-2mg6q\") pod \"040bbc22-68da-4384-981c-4b7716352d49\" (UID: \"040bbc22-68da-4384-981c-4b7716352d49\") " Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.064010 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/040bbc22-68da-4384-981c-4b7716352d49-ssh-key\") pod \"040bbc22-68da-4384-981c-4b7716352d49\" (UID: \"040bbc22-68da-4384-981c-4b7716352d49\") " Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.064083 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/040bbc22-68da-4384-981c-4b7716352d49-inventory\") pod \"040bbc22-68da-4384-981c-4b7716352d49\" (UID: \"040bbc22-68da-4384-981c-4b7716352d49\") " Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.069334 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/040bbc22-68da-4384-981c-4b7716352d49-kube-api-access-2mg6q" (OuterVolumeSpecName: "kube-api-access-2mg6q") pod "040bbc22-68da-4384-981c-4b7716352d49" (UID: "040bbc22-68da-4384-981c-4b7716352d49"). InnerVolumeSpecName "kube-api-access-2mg6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.110569 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/040bbc22-68da-4384-981c-4b7716352d49-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "040bbc22-68da-4384-981c-4b7716352d49" (UID: "040bbc22-68da-4384-981c-4b7716352d49"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.111875 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/040bbc22-68da-4384-981c-4b7716352d49-inventory" (OuterVolumeSpecName: "inventory") pod "040bbc22-68da-4384-981c-4b7716352d49" (UID: "040bbc22-68da-4384-981c-4b7716352d49"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.166170 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/040bbc22-68da-4384-981c-4b7716352d49-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.166204 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/040bbc22-68da-4384-981c-4b7716352d49-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.166216 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mg6q\" (UniqueName: \"kubernetes.io/projected/040bbc22-68da-4384-981c-4b7716352d49-kube-api-access-2mg6q\") on node \"crc\" DevicePath \"\"" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.531737 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" event={"ID":"040bbc22-68da-4384-981c-4b7716352d49","Type":"ContainerDied","Data":"85f642186d22a0508ae35565d5a586577006b7becb26a0b9cc629df30c2cea4d"} Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.531783 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85f642186d22a0508ae35565d5a586577006b7becb26a0b9cc629df30c2cea4d" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.531801 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.612388 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d"] Oct 03 18:42:44 crc kubenswrapper[4835]: E1003 18:42:44.612833 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040bbc22-68da-4384-981c-4b7716352d49" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.612853 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="040bbc22-68da-4384-981c-4b7716352d49" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.613043 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="040bbc22-68da-4384-981c-4b7716352d49" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.613763 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.615992 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.616301 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.616670 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.619383 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.620728 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d"] Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.780147 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d0143f0-3796-4a3e-985b-7c240fd0158b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d\" (UID: \"2d0143f0-3796-4a3e-985b-7c240fd0158b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.780237 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d0143f0-3796-4a3e-985b-7c240fd0158b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d\" (UID: \"2d0143f0-3796-4a3e-985b-7c240fd0158b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.780565 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wxmb\" (UniqueName: \"kubernetes.io/projected/2d0143f0-3796-4a3e-985b-7c240fd0158b-kube-api-access-8wxmb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d\" (UID: \"2d0143f0-3796-4a3e-985b-7c240fd0158b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.882979 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d0143f0-3796-4a3e-985b-7c240fd0158b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d\" (UID: \"2d0143f0-3796-4a3e-985b-7c240fd0158b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.883149 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d0143f0-3796-4a3e-985b-7c240fd0158b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d\" (UID: \"2d0143f0-3796-4a3e-985b-7c240fd0158b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.883318 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wxmb\" (UniqueName: \"kubernetes.io/projected/2d0143f0-3796-4a3e-985b-7c240fd0158b-kube-api-access-8wxmb\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d\" (UID: \"2d0143f0-3796-4a3e-985b-7c240fd0158b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.887852 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d0143f0-3796-4a3e-985b-7c240fd0158b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d\" (UID: \"2d0143f0-3796-4a3e-985b-7c240fd0158b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.887854 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d0143f0-3796-4a3e-985b-7c240fd0158b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d\" (UID: \"2d0143f0-3796-4a3e-985b-7c240fd0158b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.909137 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wxmb\" (UniqueName: \"kubernetes.io/projected/2d0143f0-3796-4a3e-985b-7c240fd0158b-kube-api-access-8wxmb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d\" (UID: \"2d0143f0-3796-4a3e-985b-7c240fd0158b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" Oct 03 18:42:44 crc kubenswrapper[4835]: I1003 18:42:44.932209 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" Oct 03 18:42:45 crc kubenswrapper[4835]: I1003 18:42:45.416657 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d"] Oct 03 18:42:45 crc kubenswrapper[4835]: I1003 18:42:45.545155 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" event={"ID":"2d0143f0-3796-4a3e-985b-7c240fd0158b","Type":"ContainerStarted","Data":"20a85aa195ae1402c1b361d97187de2ca10b8b297f7bc35b352fb8baf2cc2058"} Oct 03 18:42:46 crc kubenswrapper[4835]: I1003 18:42:46.553803 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" event={"ID":"2d0143f0-3796-4a3e-985b-7c240fd0158b","Type":"ContainerStarted","Data":"2e789836b5e487178a6c86e33e812ddbfb273dcfd5d29882ddd3e1464073845b"} Oct 03 18:42:46 crc kubenswrapper[4835]: I1003 18:42:46.573318 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" podStartSLOduration=2.050848918 podStartE2EDuration="2.573301583s" podCreationTimestamp="2025-10-03 18:42:44 +0000 UTC" firstStartedPulling="2025-10-03 18:42:45.42333573 +0000 UTC m=+1707.139276602" lastFinishedPulling="2025-10-03 18:42:45.945788395 +0000 UTC m=+1707.661729267" observedRunningTime="2025-10-03 18:42:46.56626232 +0000 UTC m=+1708.282203202" watchObservedRunningTime="2025-10-03 18:42:46.573301583 +0000 UTC m=+1708.289242455" Oct 03 18:42:52 crc kubenswrapper[4835]: I1003 18:42:52.036225 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-xkgv4"] Oct 03 18:42:52 crc kubenswrapper[4835]: I1003 18:42:52.044578 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-db-sync-fz6rs"] Oct 03 18:42:52 crc kubenswrapper[4835]: I1003 18:42:52.051846 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-fz6rs"] Oct 03 18:42:52 crc kubenswrapper[4835]: I1003 18:42:52.060091 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-t2cnf"] Oct 03 18:42:52 crc kubenswrapper[4835]: I1003 18:42:52.067402 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-t2cnf"] Oct 03 18:42:52 crc kubenswrapper[4835]: I1003 18:42:52.074466 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-xkgv4"] Oct 03 18:42:52 crc kubenswrapper[4835]: I1003 18:42:52.890018 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a" path="/var/lib/kubelet/pods/4a4da7b3-24ff-4f0a-8fd5-95e81f4e5d5a/volumes" Oct 03 18:42:52 crc kubenswrapper[4835]: I1003 18:42:52.890899 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="709d622b-7993-4d18-8185-10b4f1c81d79" path="/var/lib/kubelet/pods/709d622b-7993-4d18-8185-10b4f1c81d79/volumes" Oct 03 18:42:52 crc kubenswrapper[4835]: I1003 18:42:52.891595 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddee52bd-4539-46b6-a51f-50fe9278665a" path="/var/lib/kubelet/pods/ddee52bd-4539-46b6-a51f-50fe9278665a/volumes" Oct 03 18:42:55 crc kubenswrapper[4835]: I1003 18:42:55.877526 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:42:55 crc kubenswrapper[4835]: E1003 18:42:55.878113 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:43:08 crc kubenswrapper[4835]: I1003 18:43:08.029480 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-dpk2w"] Oct 03 18:43:08 crc kubenswrapper[4835]: I1003 18:43:08.038174 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-dpk2w"] Oct 03 18:43:08 crc kubenswrapper[4835]: I1003 18:43:08.889739 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="705966b1-0d0b-4c12-9cc1-830277fcf80c" path="/var/lib/kubelet/pods/705966b1-0d0b-4c12-9cc1-830277fcf80c/volumes" Oct 03 18:43:09 crc kubenswrapper[4835]: I1003 18:43:09.877637 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:43:09 crc kubenswrapper[4835]: E1003 18:43:09.878355 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:43:16 crc kubenswrapper[4835]: I1003 18:43:16.037954 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-d274k"] Oct 03 18:43:16 crc kubenswrapper[4835]: I1003 18:43:16.046628 
4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-d274k"] Oct 03 18:43:16 crc kubenswrapper[4835]: I1003 18:43:16.891584 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f142f3b-9cce-451e-82b0-bfdac3ec661c" path="/var/lib/kubelet/pods/4f142f3b-9cce-451e-82b0-bfdac3ec661c/volumes" Oct 03 18:43:24 crc kubenswrapper[4835]: I1003 18:43:24.877479 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:43:24 crc kubenswrapper[4835]: E1003 18:43:24.878463 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:43:39 crc kubenswrapper[4835]: I1003 18:43:39.877633 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:43:39 crc kubenswrapper[4835]: E1003 18:43:39.878803 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:43:42 crc kubenswrapper[4835]: I1003 18:43:42.100711 4835 scope.go:117] "RemoveContainer" containerID="e80a414b0a4a919d1e0d59a1d6d41f19a37acaf0b810e3050b6700b1ac80c2cd" Oct 03 18:43:42 crc kubenswrapper[4835]: I1003 18:43:42.124059 4835 scope.go:117] "RemoveContainer" containerID="612fafc7ff04680a4fe6b0807c59155113a4aefe07daa8e9ca167e389427b7c1" Oct 03 18:43:42 crc kubenswrapper[4835]: I1003 18:43:42.176893 4835 scope.go:117] "RemoveContainer" containerID="5cd877aa81327a4fb6f95fd381cf6d8fa29e66b4e8e1d3ca29fd19134069036c" Oct 03 18:43:42 crc kubenswrapper[4835]: I1003 18:43:42.222163 4835 scope.go:117] "RemoveContainer" containerID="77b55f5478320213dad5b94639c84645d6fcdc901ad041694eeb730d015016f5" Oct 03 18:43:42 crc kubenswrapper[4835]: I1003 18:43:42.273918 4835 scope.go:117] "RemoveContainer" containerID="6ca2f11f0cbfad15130861cd6db25151abca97e492dafa18163bd5cbdc425bba" Oct 03 18:43:43 crc kubenswrapper[4835]: I1003 18:43:43.037850 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xw7fm"] Oct 03 18:43:43 crc kubenswrapper[4835]: I1003 18:43:43.046812 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-k2fcn"] Oct 03 18:43:43 crc kubenswrapper[4835]: I1003 18:43:43.056242 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xw7fm"] Oct 03 18:43:43 crc kubenswrapper[4835]: I1003 18:43:43.071786 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-k2fcn"] Oct 03 18:43:44 crc kubenswrapper[4835]: I1003 18:43:44.024317 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-cw6qt"] Oct 03 18:43:44 crc kubenswrapper[4835]: I1003 18:43:44.034887 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-cw6qt"] Oct 
03 18:43:44 crc kubenswrapper[4835]: I1003 18:43:44.889512 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31632d24-3d3f-438c-a447-1a38f58ac87b" path="/var/lib/kubelet/pods/31632d24-3d3f-438c-a447-1a38f58ac87b/volumes" Oct 03 18:43:44 crc kubenswrapper[4835]: I1003 18:43:44.890030 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d39d05ac-72e8-4449-8b55-1b4126c64554" path="/var/lib/kubelet/pods/d39d05ac-72e8-4449-8b55-1b4126c64554/volumes" Oct 03 18:43:44 crc kubenswrapper[4835]: I1003 18:43:44.890634 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc2b577b-2a9c-4651-95f2-ad815b073b61" path="/var/lib/kubelet/pods/fc2b577b-2a9c-4651-95f2-ad815b073b61/volumes" Oct 03 18:43:54 crc kubenswrapper[4835]: I1003 18:43:54.877040 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:43:54 crc kubenswrapper[4835]: E1003 18:43:54.877911 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:43:57 crc kubenswrapper[4835]: I1003 18:43:57.166492 4835 generic.go:334] "Generic (PLEG): container finished" podID="2d0143f0-3796-4a3e-985b-7c240fd0158b" containerID="2e789836b5e487178a6c86e33e812ddbfb273dcfd5d29882ddd3e1464073845b" exitCode=0 Oct 03 18:43:57 crc kubenswrapper[4835]: I1003 18:43:57.166593 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" event={"ID":"2d0143f0-3796-4a3e-985b-7c240fd0158b","Type":"ContainerDied","Data":"2e789836b5e487178a6c86e33e812ddbfb273dcfd5d29882ddd3e1464073845b"} Oct 03 18:43:58 crc kubenswrapper[4835]: I1003 18:43:58.573482 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" Oct 03 18:43:58 crc kubenswrapper[4835]: I1003 18:43:58.730035 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wxmb\" (UniqueName: \"kubernetes.io/projected/2d0143f0-3796-4a3e-985b-7c240fd0158b-kube-api-access-8wxmb\") pod \"2d0143f0-3796-4a3e-985b-7c240fd0158b\" (UID: \"2d0143f0-3796-4a3e-985b-7c240fd0158b\") " Oct 03 18:43:58 crc kubenswrapper[4835]: I1003 18:43:58.730220 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d0143f0-3796-4a3e-985b-7c240fd0158b-inventory\") pod \"2d0143f0-3796-4a3e-985b-7c240fd0158b\" (UID: \"2d0143f0-3796-4a3e-985b-7c240fd0158b\") " Oct 03 18:43:58 crc kubenswrapper[4835]: I1003 18:43:58.730278 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d0143f0-3796-4a3e-985b-7c240fd0158b-ssh-key\") pod \"2d0143f0-3796-4a3e-985b-7c240fd0158b\" (UID: \"2d0143f0-3796-4a3e-985b-7c240fd0158b\") " Oct 03 18:43:58 crc kubenswrapper[4835]: I1003 18:43:58.735884 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d0143f0-3796-4a3e-985b-7c240fd0158b-kube-api-access-8wxmb" (OuterVolumeSpecName: "kube-api-access-8wxmb") pod "2d0143f0-3796-4a3e-985b-7c240fd0158b" (UID: "2d0143f0-3796-4a3e-985b-7c240fd0158b"). InnerVolumeSpecName "kube-api-access-8wxmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:43:58 crc kubenswrapper[4835]: I1003 18:43:58.756415 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d0143f0-3796-4a3e-985b-7c240fd0158b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2d0143f0-3796-4a3e-985b-7c240fd0158b" (UID: "2d0143f0-3796-4a3e-985b-7c240fd0158b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:43:58 crc kubenswrapper[4835]: I1003 18:43:58.756463 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d0143f0-3796-4a3e-985b-7c240fd0158b-inventory" (OuterVolumeSpecName: "inventory") pod "2d0143f0-3796-4a3e-985b-7c240fd0158b" (UID: "2d0143f0-3796-4a3e-985b-7c240fd0158b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:43:58 crc kubenswrapper[4835]: I1003 18:43:58.832769 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d0143f0-3796-4a3e-985b-7c240fd0158b-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 18:43:58 crc kubenswrapper[4835]: I1003 18:43:58.832808 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d0143f0-3796-4a3e-985b-7c240fd0158b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:43:58 crc kubenswrapper[4835]: I1003 18:43:58.832819 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wxmb\" (UniqueName: \"kubernetes.io/projected/2d0143f0-3796-4a3e-985b-7c240fd0158b-kube-api-access-8wxmb\") on node \"crc\" DevicePath \"\"" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.185722 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" event={"ID":"2d0143f0-3796-4a3e-985b-7c240fd0158b","Type":"ContainerDied","Data":"20a85aa195ae1402c1b361d97187de2ca10b8b297f7bc35b352fb8baf2cc2058"} Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.185754 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20a85aa195ae1402c1b361d97187de2ca10b8b297f7bc35b352fb8baf2cc2058" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.185834 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.270979 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9"] Oct 03 18:43:59 crc kubenswrapper[4835]: E1003 18:43:59.271459 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d0143f0-3796-4a3e-985b-7c240fd0158b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.271489 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d0143f0-3796-4a3e-985b-7c240fd0158b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.271744 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d0143f0-3796-4a3e-985b-7c240fd0158b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.272855 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.275434 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.275627 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.277649 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.283191 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.284558 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9"] Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.445132 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cb161f9-3a8e-40ef-999e-03b98142d09d-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gcph9\" (UID: \"2cb161f9-3a8e-40ef-999e-03b98142d09d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.445202 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cb161f9-3a8e-40ef-999e-03b98142d09d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gcph9\" (UID: \"2cb161f9-3a8e-40ef-999e-03b98142d09d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.445761 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsvdr\" (UniqueName: \"kubernetes.io/projected/2cb161f9-3a8e-40ef-999e-03b98142d09d-kube-api-access-rsvdr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gcph9\" (UID: \"2cb161f9-3a8e-40ef-999e-03b98142d09d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.547914 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cb161f9-3a8e-40ef-999e-03b98142d09d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gcph9\" (UID: \"2cb161f9-3a8e-40ef-999e-03b98142d09d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.548127 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsvdr\" (UniqueName: \"kubernetes.io/projected/2cb161f9-3a8e-40ef-999e-03b98142d09d-kube-api-access-rsvdr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gcph9\" (UID: \"2cb161f9-3a8e-40ef-999e-03b98142d09d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.548175 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cb161f9-3a8e-40ef-999e-03b98142d09d-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-gcph9\" (UID: \"2cb161f9-3a8e-40ef-999e-03b98142d09d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.552373 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cb161f9-3a8e-40ef-999e-03b98142d09d-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gcph9\" (UID: \"2cb161f9-3a8e-40ef-999e-03b98142d09d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.553914 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cb161f9-3a8e-40ef-999e-03b98142d09d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gcph9\" (UID: \"2cb161f9-3a8e-40ef-999e-03b98142d09d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.565783 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsvdr\" (UniqueName: \"kubernetes.io/projected/2cb161f9-3a8e-40ef-999e-03b98142d09d-kube-api-access-rsvdr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-gcph9\" (UID: \"2cb161f9-3a8e-40ef-999e-03b98142d09d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" Oct 03 18:43:59 crc kubenswrapper[4835]: I1003 18:43:59.588815 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" Oct 03 18:44:00 crc kubenswrapper[4835]: I1003 18:44:00.109994 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9"] Oct 03 18:44:00 crc kubenswrapper[4835]: I1003 18:44:00.194737 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" event={"ID":"2cb161f9-3a8e-40ef-999e-03b98142d09d","Type":"ContainerStarted","Data":"a75f6167de661df0832d17081fa3b4c435bc19a63798de7dcb86c561f1280cf4"} Oct 03 18:44:01 crc kubenswrapper[4835]: I1003 18:44:01.034758 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0521-account-create-bb2fh"] Oct 03 18:44:01 crc kubenswrapper[4835]: I1003 18:44:01.045688 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4642-account-create-9fv4f"] Oct 03 18:44:01 crc kubenswrapper[4835]: I1003 18:44:01.054063 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3b08-account-create-ddf9g"] Oct 03 18:44:01 crc kubenswrapper[4835]: I1003 18:44:01.061319 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0521-account-create-bb2fh"] Oct 03 18:44:01 crc kubenswrapper[4835]: I1003 18:44:01.079467 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4642-account-create-9fv4f"] Oct 03 18:44:01 crc kubenswrapper[4835]: I1003 18:44:01.089279 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3b08-account-create-ddf9g"] Oct 03 18:44:01 crc kubenswrapper[4835]: I1003 18:44:01.207324 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" 
event={"ID":"2cb161f9-3a8e-40ef-999e-03b98142d09d","Type":"ContainerStarted","Data":"d23be7f1db954417284984539361c373892fcdc31460cc0623635e2d0bb13579"} Oct 03 18:44:01 crc kubenswrapper[4835]: I1003 18:44:01.225499 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" podStartSLOduration=1.730050233 podStartE2EDuration="2.225478891s" podCreationTimestamp="2025-10-03 18:43:59 +0000 UTC" firstStartedPulling="2025-10-03 18:44:00.112192304 +0000 UTC m=+1781.828133176" lastFinishedPulling="2025-10-03 18:44:00.607620962 +0000 UTC m=+1782.323561834" observedRunningTime="2025-10-03 18:44:01.222978 +0000 UTC m=+1782.938918872" watchObservedRunningTime="2025-10-03 18:44:01.225478891 +0000 UTC m=+1782.941419783" Oct 03 18:44:02 crc kubenswrapper[4835]: I1003 18:44:02.889739 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5041363c-5aa7-4917-9a7e-0c3fbc478222" path="/var/lib/kubelet/pods/5041363c-5aa7-4917-9a7e-0c3fbc478222/volumes" Oct 03 18:44:02 crc kubenswrapper[4835]: I1003 18:44:02.890566 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64736945-0cbb-43dc-9311-4c03123c878b" path="/var/lib/kubelet/pods/64736945-0cbb-43dc-9311-4c03123c878b/volumes" Oct 03 18:44:02 crc kubenswrapper[4835]: I1003 18:44:02.891047 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76843d0-52ea-4ef4-ad05-460c819fed88" path="/var/lib/kubelet/pods/b76843d0-52ea-4ef4-ad05-460c819fed88/volumes" Oct 03 18:44:06 crc kubenswrapper[4835]: I1003 18:44:06.249404 4835 generic.go:334] "Generic (PLEG): container finished" podID="2cb161f9-3a8e-40ef-999e-03b98142d09d" containerID="d23be7f1db954417284984539361c373892fcdc31460cc0623635e2d0bb13579" exitCode=0 Oct 03 18:44:06 crc kubenswrapper[4835]: I1003 18:44:06.249506 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" event={"ID":"2cb161f9-3a8e-40ef-999e-03b98142d09d","Type":"ContainerDied","Data":"d23be7f1db954417284984539361c373892fcdc31460cc0623635e2d0bb13579"} Oct 03 18:44:07 crc kubenswrapper[4835]: I1003 18:44:07.642707 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" Oct 03 18:44:07 crc kubenswrapper[4835]: I1003 18:44:07.806828 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cb161f9-3a8e-40ef-999e-03b98142d09d-ssh-key\") pod \"2cb161f9-3a8e-40ef-999e-03b98142d09d\" (UID: \"2cb161f9-3a8e-40ef-999e-03b98142d09d\") " Oct 03 18:44:07 crc kubenswrapper[4835]: I1003 18:44:07.806885 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsvdr\" (UniqueName: \"kubernetes.io/projected/2cb161f9-3a8e-40ef-999e-03b98142d09d-kube-api-access-rsvdr\") pod \"2cb161f9-3a8e-40ef-999e-03b98142d09d\" (UID: \"2cb161f9-3a8e-40ef-999e-03b98142d09d\") " Oct 03 18:44:07 crc kubenswrapper[4835]: I1003 18:44:07.807006 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cb161f9-3a8e-40ef-999e-03b98142d09d-inventory\") pod \"2cb161f9-3a8e-40ef-999e-03b98142d09d\" (UID: \"2cb161f9-3a8e-40ef-999e-03b98142d09d\") " Oct 03 18:44:07 crc kubenswrapper[4835]: I1003 18:44:07.812285 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cb161f9-3a8e-40ef-999e-03b98142d09d-kube-api-access-rsvdr" (OuterVolumeSpecName: "kube-api-access-rsvdr") pod "2cb161f9-3a8e-40ef-999e-03b98142d09d" (UID: "2cb161f9-3a8e-40ef-999e-03b98142d09d"). InnerVolumeSpecName "kube-api-access-rsvdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:44:07 crc kubenswrapper[4835]: I1003 18:44:07.835404 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb161f9-3a8e-40ef-999e-03b98142d09d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2cb161f9-3a8e-40ef-999e-03b98142d09d" (UID: "2cb161f9-3a8e-40ef-999e-03b98142d09d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:44:07 crc kubenswrapper[4835]: I1003 18:44:07.835539 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb161f9-3a8e-40ef-999e-03b98142d09d-inventory" (OuterVolumeSpecName: "inventory") pod "2cb161f9-3a8e-40ef-999e-03b98142d09d" (UID: "2cb161f9-3a8e-40ef-999e-03b98142d09d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:44:07 crc kubenswrapper[4835]: I1003 18:44:07.910252 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2cb161f9-3a8e-40ef-999e-03b98142d09d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:44:07 crc kubenswrapper[4835]: I1003 18:44:07.910286 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsvdr\" (UniqueName: \"kubernetes.io/projected/2cb161f9-3a8e-40ef-999e-03b98142d09d-kube-api-access-rsvdr\") on node \"crc\" DevicePath \"\"" Oct 03 18:44:07 crc kubenswrapper[4835]: I1003 18:44:07.910298 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cb161f9-3a8e-40ef-999e-03b98142d09d-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.268088 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" event={"ID":"2cb161f9-3a8e-40ef-999e-03b98142d09d","Type":"ContainerDied","Data":"a75f6167de661df0832d17081fa3b4c435bc19a63798de7dcb86c561f1280cf4"} Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.268406 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a75f6167de661df0832d17081fa3b4c435bc19a63798de7dcb86c561f1280cf4" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.268124 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-gcph9" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.338235 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws"] Oct 03 18:44:08 crc kubenswrapper[4835]: E1003 18:44:08.338649 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb161f9-3a8e-40ef-999e-03b98142d09d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.338666 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb161f9-3a8e-40ef-999e-03b98142d09d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.338883 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb161f9-3a8e-40ef-999e-03b98142d09d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.339685 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.342191 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.343495 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.343549 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.345533 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.349477 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws"] Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.522868 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lx8h\" (UniqueName: \"kubernetes.io/projected/b230c3be-e2a6-49eb-90f7-97732a8be2ad-kube-api-access-5lx8h\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gpsws\" (UID: \"b230c3be-e2a6-49eb-90f7-97732a8be2ad\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.523260 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b230c3be-e2a6-49eb-90f7-97732a8be2ad-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gpsws\" (UID: \"b230c3be-e2a6-49eb-90f7-97732a8be2ad\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.523400 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b230c3be-e2a6-49eb-90f7-97732a8be2ad-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gpsws\" (UID: \"b230c3be-e2a6-49eb-90f7-97732a8be2ad\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.624941 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lx8h\" (UniqueName: \"kubernetes.io/projected/b230c3be-e2a6-49eb-90f7-97732a8be2ad-kube-api-access-5lx8h\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gpsws\" (UID: \"b230c3be-e2a6-49eb-90f7-97732a8be2ad\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.625275 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b230c3be-e2a6-49eb-90f7-97732a8be2ad-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gpsws\" (UID: \"b230c3be-e2a6-49eb-90f7-97732a8be2ad\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.625427 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b230c3be-e2a6-49eb-90f7-97732a8be2ad-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gpsws\" (UID: 
\"b230c3be-e2a6-49eb-90f7-97732a8be2ad\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.630336 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b230c3be-e2a6-49eb-90f7-97732a8be2ad-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gpsws\" (UID: \"b230c3be-e2a6-49eb-90f7-97732a8be2ad\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.636491 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b230c3be-e2a6-49eb-90f7-97732a8be2ad-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gpsws\" (UID: \"b230c3be-e2a6-49eb-90f7-97732a8be2ad\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.641781 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lx8h\" (UniqueName: \"kubernetes.io/projected/b230c3be-e2a6-49eb-90f7-97732a8be2ad-kube-api-access-5lx8h\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-gpsws\" (UID: \"b230c3be-e2a6-49eb-90f7-97732a8be2ad\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.662191 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" Oct 03 18:44:08 crc kubenswrapper[4835]: I1003 18:44:08.885518 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:44:08 crc kubenswrapper[4835]: E1003 18:44:08.889177 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:44:09 crc kubenswrapper[4835]: I1003 18:44:09.187554 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws"] Oct 03 18:44:09 crc kubenswrapper[4835]: I1003 18:44:09.277178 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" event={"ID":"b230c3be-e2a6-49eb-90f7-97732a8be2ad","Type":"ContainerStarted","Data":"3feb688f7150b8d1e3da68f0bacf74c4d61a17c70ac565492b8237b252df5212"} Oct 03 18:44:10 crc kubenswrapper[4835]: I1003 18:44:10.286694 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" event={"ID":"b230c3be-e2a6-49eb-90f7-97732a8be2ad","Type":"ContainerStarted","Data":"dbba7be5b7d63a207a9ee004d13d267ee701cc6fe2c85fde33f8c0c7715b1d87"} Oct 03 18:44:10 crc kubenswrapper[4835]: I1003 18:44:10.305623 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" podStartSLOduration=1.9176668810000002 podStartE2EDuration="2.305603886s" podCreationTimestamp="2025-10-03 18:44:08 +0000 UTC" firstStartedPulling="2025-10-03 18:44:09.199992557 +0000 UTC m=+1790.915933429" 
lastFinishedPulling="2025-10-03 18:44:09.587929562 +0000 UTC m=+1791.303870434" observedRunningTime="2025-10-03 18:44:10.303244207 +0000 UTC m=+1792.019185079" watchObservedRunningTime="2025-10-03 18:44:10.305603886 +0000 UTC m=+1792.021544758" Oct 03 18:44:20 crc kubenswrapper[4835]: I1003 18:44:20.877233 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:44:20 crc kubenswrapper[4835]: E1003 18:44:20.878171 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:44:29 crc kubenswrapper[4835]: I1003 18:44:29.040972 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c9pn5"] Oct 03 18:44:29 crc kubenswrapper[4835]: I1003 18:44:29.052178 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-c9pn5"] Oct 03 18:44:30 crc kubenswrapper[4835]: I1003 18:44:30.889309 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9703db0d-8373-4427-98a0-c36d069f7c71" path="/var/lib/kubelet/pods/9703db0d-8373-4427-98a0-c36d069f7c71/volumes" Oct 03 18:44:31 crc kubenswrapper[4835]: I1003 18:44:31.876532 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:44:31 crc kubenswrapper[4835]: E1003 18:44:31.877048 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:44:42 crc kubenswrapper[4835]: I1003 18:44:42.402506 4835 scope.go:117] "RemoveContainer" containerID="503b690c67c08bddcd3404403757244f7fe7841067c825256213f9247328855a" Oct 03 18:44:42 crc kubenswrapper[4835]: I1003 18:44:42.430633 4835 scope.go:117] "RemoveContainer" containerID="97a200106343d17c7b5557a76f0f908bf71800bbbd2581afc6ddfc69e637248a" Oct 03 18:44:42 crc kubenswrapper[4835]: I1003 18:44:42.497034 4835 scope.go:117] "RemoveContainer" containerID="3d4b88308c914465b10d8db996b8bb3ca48477dff927d18fdf94c1236e2a76dd" Oct 03 18:44:42 crc kubenswrapper[4835]: I1003 18:44:42.557189 4835 scope.go:117] "RemoveContainer" containerID="dc000fb4deab8ffabde1216d155dc0aea2824f8bf6e830491125bb95ba339aa4" Oct 03 18:44:42 crc kubenswrapper[4835]: I1003 18:44:42.603994 4835 scope.go:117] "RemoveContainer" containerID="186d768c80cff48d5e649af26c5f1e1742246a8f2c5cf136125676814e95f320" Oct 03 18:44:42 crc kubenswrapper[4835]: I1003 18:44:42.634860 4835 scope.go:117] "RemoveContainer" containerID="4e511619ab70773b888b62bab0bc38f6388687b9ed0b22698077a37c6edcb38f" Oct 03 18:44:42 crc kubenswrapper[4835]: I1003 18:44:42.676477 4835 scope.go:117] "RemoveContainer" containerID="71c8fdd79207970fbc9654e70c23a172a47f0fdab08777e00c655f5282cbb845" Oct 03 18:44:46 crc kubenswrapper[4835]: I1003 18:44:46.877637 4835 scope.go:117] "RemoveContainer" 
containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:44:46 crc kubenswrapper[4835]: E1003 18:44:46.878576 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:44:51 crc kubenswrapper[4835]: I1003 18:44:51.686362 4835 generic.go:334] "Generic (PLEG): container finished" podID="b230c3be-e2a6-49eb-90f7-97732a8be2ad" containerID="dbba7be5b7d63a207a9ee004d13d267ee701cc6fe2c85fde33f8c0c7715b1d87" exitCode=0 Oct 03 18:44:51 crc kubenswrapper[4835]: I1003 18:44:51.686450 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" event={"ID":"b230c3be-e2a6-49eb-90f7-97732a8be2ad","Type":"ContainerDied","Data":"dbba7be5b7d63a207a9ee004d13d267ee701cc6fe2c85fde33f8c0c7715b1d87"} Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.134416 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.317558 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b230c3be-e2a6-49eb-90f7-97732a8be2ad-ssh-key\") pod \"b230c3be-e2a6-49eb-90f7-97732a8be2ad\" (UID: \"b230c3be-e2a6-49eb-90f7-97732a8be2ad\") " Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.317680 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lx8h\" (UniqueName: \"kubernetes.io/projected/b230c3be-e2a6-49eb-90f7-97732a8be2ad-kube-api-access-5lx8h\") pod \"b230c3be-e2a6-49eb-90f7-97732a8be2ad\" (UID: \"b230c3be-e2a6-49eb-90f7-97732a8be2ad\") " Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.317769 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b230c3be-e2a6-49eb-90f7-97732a8be2ad-inventory\") pod \"b230c3be-e2a6-49eb-90f7-97732a8be2ad\" (UID: \"b230c3be-e2a6-49eb-90f7-97732a8be2ad\") " Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.323257 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b230c3be-e2a6-49eb-90f7-97732a8be2ad-kube-api-access-5lx8h" (OuterVolumeSpecName: "kube-api-access-5lx8h") pod "b230c3be-e2a6-49eb-90f7-97732a8be2ad" (UID: "b230c3be-e2a6-49eb-90f7-97732a8be2ad"). InnerVolumeSpecName "kube-api-access-5lx8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.346297 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b230c3be-e2a6-49eb-90f7-97732a8be2ad-inventory" (OuterVolumeSpecName: "inventory") pod "b230c3be-e2a6-49eb-90f7-97732a8be2ad" (UID: "b230c3be-e2a6-49eb-90f7-97732a8be2ad"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.359317 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b230c3be-e2a6-49eb-90f7-97732a8be2ad-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b230c3be-e2a6-49eb-90f7-97732a8be2ad" (UID: "b230c3be-e2a6-49eb-90f7-97732a8be2ad"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.423776 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lx8h\" (UniqueName: \"kubernetes.io/projected/b230c3be-e2a6-49eb-90f7-97732a8be2ad-kube-api-access-5lx8h\") on node \"crc\" DevicePath \"\"" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.423805 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b230c3be-e2a6-49eb-90f7-97732a8be2ad-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.423814 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b230c3be-e2a6-49eb-90f7-97732a8be2ad-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.702567 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" event={"ID":"b230c3be-e2a6-49eb-90f7-97732a8be2ad","Type":"ContainerDied","Data":"3feb688f7150b8d1e3da68f0bacf74c4d61a17c70ac565492b8237b252df5212"} Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.702863 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3feb688f7150b8d1e3da68f0bacf74c4d61a17c70ac565492b8237b252df5212" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.702656 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-gpsws" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.787805 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2"] Oct 03 18:44:53 crc kubenswrapper[4835]: E1003 18:44:53.788234 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b230c3be-e2a6-49eb-90f7-97732a8be2ad" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.788254 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b230c3be-e2a6-49eb-90f7-97732a8be2ad" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.788510 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b230c3be-e2a6-49eb-90f7-97732a8be2ad" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.789358 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.792471 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.793779 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.793804 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.803037 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2"] Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.803249 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.931808 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ee17011-d405-4e45-84c9-b48eb4ec6820-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2\" (UID: \"2ee17011-d405-4e45-84c9-b48eb4ec6820\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.932692 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdppk\" (UniqueName: \"kubernetes.io/projected/2ee17011-d405-4e45-84c9-b48eb4ec6820-kube-api-access-mdppk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2\" (UID: \"2ee17011-d405-4e45-84c9-b48eb4ec6820\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" Oct 03 18:44:53 crc kubenswrapper[4835]: I1003 18:44:53.932906 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ee17011-d405-4e45-84c9-b48eb4ec6820-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2\" (UID: \"2ee17011-d405-4e45-84c9-b48eb4ec6820\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" Oct 03 18:44:54 crc kubenswrapper[4835]: I1003 18:44:54.034561 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ee17011-d405-4e45-84c9-b48eb4ec6820-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2\" (UID: \"2ee17011-d405-4e45-84c9-b48eb4ec6820\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" Oct 03 18:44:54 crc kubenswrapper[4835]: I1003 18:44:54.034634 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdppk\" (UniqueName: \"kubernetes.io/projected/2ee17011-d405-4e45-84c9-b48eb4ec6820-kube-api-access-mdppk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2\" (UID: \"2ee17011-d405-4e45-84c9-b48eb4ec6820\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" Oct 03 18:44:54 crc kubenswrapper[4835]: I1003 18:44:54.034758 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ee17011-d405-4e45-84c9-b48eb4ec6820-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2\" 
(UID: \"2ee17011-d405-4e45-84c9-b48eb4ec6820\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" Oct 03 18:44:54 crc kubenswrapper[4835]: I1003 18:44:54.040121 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ee17011-d405-4e45-84c9-b48eb4ec6820-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2\" (UID: \"2ee17011-d405-4e45-84c9-b48eb4ec6820\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" Oct 03 18:44:54 crc kubenswrapper[4835]: I1003 18:44:54.042489 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ee17011-d405-4e45-84c9-b48eb4ec6820-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2\" (UID: \"2ee17011-d405-4e45-84c9-b48eb4ec6820\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" Oct 03 18:44:54 crc kubenswrapper[4835]: I1003 18:44:54.042612 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-7jx5c"] Oct 03 18:44:54 crc kubenswrapper[4835]: I1003 18:44:54.050533 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-7jx5c"] Oct 03 18:44:54 crc kubenswrapper[4835]: I1003 18:44:54.055931 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdppk\" (UniqueName: \"kubernetes.io/projected/2ee17011-d405-4e45-84c9-b48eb4ec6820-kube-api-access-mdppk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2\" (UID: \"2ee17011-d405-4e45-84c9-b48eb4ec6820\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" Oct 03 18:44:54 crc kubenswrapper[4835]: I1003 18:44:54.108523 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" Oct 03 18:44:54 crc kubenswrapper[4835]: I1003 18:44:54.588689 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2"] Oct 03 18:44:54 crc kubenswrapper[4835]: I1003 18:44:54.712827 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" event={"ID":"2ee17011-d405-4e45-84c9-b48eb4ec6820","Type":"ContainerStarted","Data":"14270b08cc0d7a686f41e4d4bff58d7330977b583cb8dcc20a47e44b0b3c233b"} Oct 03 18:44:54 crc kubenswrapper[4835]: I1003 18:44:54.887892 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce857dee-2b33-413b-8040-6012915be992" path="/var/lib/kubelet/pods/ce857dee-2b33-413b-8040-6012915be992/volumes" Oct 03 18:44:55 crc kubenswrapper[4835]: I1003 18:44:55.032001 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7kxdv"] Oct 03 18:44:55 crc kubenswrapper[4835]: I1003 18:44:55.039194 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7kxdv"] Oct 03 18:44:55 crc kubenswrapper[4835]: I1003 18:44:55.721970 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" event={"ID":"2ee17011-d405-4e45-84c9-b48eb4ec6820","Type":"ContainerStarted","Data":"6402531c1f56848fea2ddef31486ecec1018274a001951deb94c085fad089b8f"} Oct 03 18:44:55 crc kubenswrapper[4835]: I1003 18:44:55.743962 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" podStartSLOduration=2.142995911 podStartE2EDuration="2.743942234s" podCreationTimestamp="2025-10-03 18:44:53 +0000 UTC" firstStartedPulling="2025-10-03 18:44:54.595505898 +0000 UTC m=+1836.311446770" lastFinishedPulling="2025-10-03 18:44:55.196452221 +0000 UTC m=+1836.912393093" observedRunningTime="2025-10-03 18:44:55.736208013 +0000 UTC m=+1837.452148895" watchObservedRunningTime="2025-10-03 18:44:55.743942234 +0000 UTC m=+1837.459883106" Oct 03 18:44:56 crc kubenswrapper[4835]: I1003 18:44:56.917291 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbaaef5c-4627-46d0-8673-1cf9767ab4d6" path="/var/lib/kubelet/pods/bbaaef5c-4627-46d0-8673-1cf9767ab4d6/volumes" Oct 03 18:44:57 crc kubenswrapper[4835]: I1003 18:44:57.877132 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:44:57 crc kubenswrapper[4835]: E1003 18:44:57.877373 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:45:00 crc kubenswrapper[4835]: I1003 18:45:00.133035 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6"] Oct 03 18:45:00 crc kubenswrapper[4835]: I1003 18:45:00.134884 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6" Oct 03 18:45:00 crc kubenswrapper[4835]: I1003 18:45:00.139616 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 18:45:00 crc kubenswrapper[4835]: I1003 18:45:00.139905 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 18:45:00 crc kubenswrapper[4835]: I1003 18:45:00.147956 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6"] Oct 03 18:45:00 crc kubenswrapper[4835]: I1003 18:45:00.246464 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqhfn\" (UniqueName: \"kubernetes.io/projected/5af9ff5a-2dca-432e-888e-8e5e11bbabff-kube-api-access-xqhfn\") pod \"collect-profiles-29325285-gwgt6\" (UID: \"5af9ff5a-2dca-432e-888e-8e5e11bbabff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6" Oct 03 18:45:00 crc kubenswrapper[4835]: I1003 18:45:00.246525 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5af9ff5a-2dca-432e-888e-8e5e11bbabff-config-volume\") pod \"collect-profiles-29325285-gwgt6\" (UID: \"5af9ff5a-2dca-432e-888e-8e5e11bbabff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6" Oct 03 18:45:00 crc kubenswrapper[4835]: I1003 18:45:00.246855 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5af9ff5a-2dca-432e-888e-8e5e11bbabff-secret-volume\") pod \"collect-profiles-29325285-gwgt6\" (UID: \"5af9ff5a-2dca-432e-888e-8e5e11bbabff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6" Oct 03 18:45:00 crc kubenswrapper[4835]: I1003 18:45:00.349118 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5af9ff5a-2dca-432e-888e-8e5e11bbabff-secret-volume\") pod \"collect-profiles-29325285-gwgt6\" (UID: \"5af9ff5a-2dca-432e-888e-8e5e11bbabff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6" Oct 03 18:45:00 crc kubenswrapper[4835]: I1003 18:45:00.349219 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqhfn\" (UniqueName: \"kubernetes.io/projected/5af9ff5a-2dca-432e-888e-8e5e11bbabff-kube-api-access-xqhfn\") pod \"collect-profiles-29325285-gwgt6\" (UID: \"5af9ff5a-2dca-432e-888e-8e5e11bbabff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6" Oct 03 18:45:00 crc kubenswrapper[4835]: I1003 18:45:00.349252 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5af9ff5a-2dca-432e-888e-8e5e11bbabff-config-volume\") pod \"collect-profiles-29325285-gwgt6\" (UID: \"5af9ff5a-2dca-432e-888e-8e5e11bbabff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6" Oct 03 18:45:00 crc kubenswrapper[4835]: I1003 18:45:00.351729 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5af9ff5a-2dca-432e-888e-8e5e11bbabff-config-volume\") pod 
\"collect-profiles-29325285-gwgt6\" (UID: \"5af9ff5a-2dca-432e-888e-8e5e11bbabff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6" Oct 03 18:45:00 crc kubenswrapper[4835]: I1003 18:45:00.354707 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5af9ff5a-2dca-432e-888e-8e5e11bbabff-secret-volume\") pod \"collect-profiles-29325285-gwgt6\" (UID: \"5af9ff5a-2dca-432e-888e-8e5e11bbabff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6" Oct 03 18:45:00 crc kubenswrapper[4835]: I1003 18:45:00.367972 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqhfn\" (UniqueName: \"kubernetes.io/projected/5af9ff5a-2dca-432e-888e-8e5e11bbabff-kube-api-access-xqhfn\") pod \"collect-profiles-29325285-gwgt6\" (UID: \"5af9ff5a-2dca-432e-888e-8e5e11bbabff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6" Oct 03 18:45:00 crc kubenswrapper[4835]: I1003 18:45:00.465120 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6" Oct 03 18:45:00 crc kubenswrapper[4835]: I1003 18:45:00.891637 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6"] Oct 03 18:45:01 crc kubenswrapper[4835]: I1003 18:45:01.771207 4835 generic.go:334] "Generic (PLEG): container finished" podID="5af9ff5a-2dca-432e-888e-8e5e11bbabff" containerID="fe4ad2d44d3c5d719d9454f465dbdd26b3155b83916f17a834f313f4aa65c79f" exitCode=0 Oct 03 18:45:01 crc kubenswrapper[4835]: I1003 18:45:01.771316 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6" event={"ID":"5af9ff5a-2dca-432e-888e-8e5e11bbabff","Type":"ContainerDied","Data":"fe4ad2d44d3c5d719d9454f465dbdd26b3155b83916f17a834f313f4aa65c79f"} Oct 03 18:45:01 crc kubenswrapper[4835]: I1003 18:45:01.771561 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6" event={"ID":"5af9ff5a-2dca-432e-888e-8e5e11bbabff","Type":"ContainerStarted","Data":"750d0eb933a6bcb8c40e86a6b4e645d6e9726c3da2e39abe0b65001b28939ef5"} Oct 03 18:45:03 crc kubenswrapper[4835]: I1003 18:45:03.087515 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6" Oct 03 18:45:03 crc kubenswrapper[4835]: I1003 18:45:03.207957 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqhfn\" (UniqueName: \"kubernetes.io/projected/5af9ff5a-2dca-432e-888e-8e5e11bbabff-kube-api-access-xqhfn\") pod \"5af9ff5a-2dca-432e-888e-8e5e11bbabff\" (UID: \"5af9ff5a-2dca-432e-888e-8e5e11bbabff\") " Oct 03 18:45:03 crc kubenswrapper[4835]: I1003 18:45:03.208640 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5af9ff5a-2dca-432e-888e-8e5e11bbabff-secret-volume\") pod \"5af9ff5a-2dca-432e-888e-8e5e11bbabff\" (UID: \"5af9ff5a-2dca-432e-888e-8e5e11bbabff\") " Oct 03 18:45:03 crc kubenswrapper[4835]: I1003 18:45:03.208799 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5af9ff5a-2dca-432e-888e-8e5e11bbabff-config-volume\") pod \"5af9ff5a-2dca-432e-888e-8e5e11bbabff\" (UID: \"5af9ff5a-2dca-432e-888e-8e5e11bbabff\") " Oct 03 18:45:03 crc kubenswrapper[4835]: I1003 18:45:03.209875 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5af9ff5a-2dca-432e-888e-8e5e11bbabff-config-volume" (OuterVolumeSpecName: "config-volume") pod "5af9ff5a-2dca-432e-888e-8e5e11bbabff" (UID: "5af9ff5a-2dca-432e-888e-8e5e11bbabff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:45:03 crc kubenswrapper[4835]: I1003 18:45:03.214513 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5af9ff5a-2dca-432e-888e-8e5e11bbabff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5af9ff5a-2dca-432e-888e-8e5e11bbabff" (UID: "5af9ff5a-2dca-432e-888e-8e5e11bbabff"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:45:03 crc kubenswrapper[4835]: I1003 18:45:03.217499 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5af9ff5a-2dca-432e-888e-8e5e11bbabff-kube-api-access-xqhfn" (OuterVolumeSpecName: "kube-api-access-xqhfn") pod "5af9ff5a-2dca-432e-888e-8e5e11bbabff" (UID: "5af9ff5a-2dca-432e-888e-8e5e11bbabff"). InnerVolumeSpecName "kube-api-access-xqhfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:45:03 crc kubenswrapper[4835]: I1003 18:45:03.310847 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5af9ff5a-2dca-432e-888e-8e5e11bbabff-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 18:45:03 crc kubenswrapper[4835]: I1003 18:45:03.311145 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5af9ff5a-2dca-432e-888e-8e5e11bbabff-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 18:45:03 crc kubenswrapper[4835]: I1003 18:45:03.311238 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqhfn\" (UniqueName: \"kubernetes.io/projected/5af9ff5a-2dca-432e-888e-8e5e11bbabff-kube-api-access-xqhfn\") on node \"crc\" DevicePath \"\"" Oct 03 18:45:03 crc kubenswrapper[4835]: I1003 18:45:03.788693 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6" Oct 03 18:45:03 crc kubenswrapper[4835]: I1003 18:45:03.788726 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6" event={"ID":"5af9ff5a-2dca-432e-888e-8e5e11bbabff","Type":"ContainerDied","Data":"750d0eb933a6bcb8c40e86a6b4e645d6e9726c3da2e39abe0b65001b28939ef5"} Oct 03 18:45:03 crc kubenswrapper[4835]: I1003 18:45:03.789109 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="750d0eb933a6bcb8c40e86a6b4e645d6e9726c3da2e39abe0b65001b28939ef5" Oct 03 18:45:11 crc kubenswrapper[4835]: I1003 18:45:11.877866 4835 scope.go:117] "RemoveContainer" containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:45:12 crc kubenswrapper[4835]: I1003 18:45:12.864498 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"5deb0cf9b1410690f101cc5c84ba400be4c91a67e6bec74de8589a872a3c0d30"} Oct 03 18:45:37 crc kubenswrapper[4835]: I1003 18:45:37.060216 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-f49gn"] Oct 03 18:45:37 crc kubenswrapper[4835]: I1003 18:45:37.073416 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-f49gn"] Oct 03 18:45:38 crc kubenswrapper[4835]: I1003 18:45:38.892148 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34c9c48d-f7af-44a7-ad66-ecaeefca60a5" path="/var/lib/kubelet/pods/34c9c48d-f7af-44a7-ad66-ecaeefca60a5/volumes" Oct 03 18:45:42 crc kubenswrapper[4835]: I1003 18:45:42.808802 4835 scope.go:117] "RemoveContainer" containerID="668e0906abe850415158c66fcbd94a09b8b5348b23de4d3d223e7dd30674d154" Oct 03 18:45:42 crc kubenswrapper[4835]: I1003 18:45:42.850238 4835 scope.go:117] "RemoveContainer" containerID="89f2716fb9948d66f1da8169f84213d5c6f6d515d31ef7074dcd34df44aa9f39" Oct 03 18:45:42 crc kubenswrapper[4835]: I1003 18:45:42.890909 4835 scope.go:117] "RemoveContainer" containerID="1d9f3c19bfad5d80e939598b95d6962719777161a191deecbf5ecbb7586c159a" Oct 03 18:45:51 crc kubenswrapper[4835]: I1003 18:45:51.225689 4835 generic.go:334] "Generic (PLEG): container finished" podID="2ee17011-d405-4e45-84c9-b48eb4ec6820" containerID="6402531c1f56848fea2ddef31486ecec1018274a001951deb94c085fad089b8f" exitCode=2 Oct 03 18:45:51 crc kubenswrapper[4835]: I1003 18:45:51.225768 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" event={"ID":"2ee17011-d405-4e45-84c9-b48eb4ec6820","Type":"ContainerDied","Data":"6402531c1f56848fea2ddef31486ecec1018274a001951deb94c085fad089b8f"} Oct 03 18:45:52 crc kubenswrapper[4835]: I1003 18:45:52.702860 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" Oct 03 18:45:52 crc kubenswrapper[4835]: I1003 18:45:52.860674 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdppk\" (UniqueName: \"kubernetes.io/projected/2ee17011-d405-4e45-84c9-b48eb4ec6820-kube-api-access-mdppk\") pod \"2ee17011-d405-4e45-84c9-b48eb4ec6820\" (UID: \"2ee17011-d405-4e45-84c9-b48eb4ec6820\") " Oct 03 18:45:52 crc kubenswrapper[4835]: I1003 18:45:52.861184 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ee17011-d405-4e45-84c9-b48eb4ec6820-inventory\") pod \"2ee17011-d405-4e45-84c9-b48eb4ec6820\" (UID: \"2ee17011-d405-4e45-84c9-b48eb4ec6820\") " Oct 03 18:45:52 crc kubenswrapper[4835]: I1003 18:45:52.861295 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ee17011-d405-4e45-84c9-b48eb4ec6820-ssh-key\") pod \"2ee17011-d405-4e45-84c9-b48eb4ec6820\" (UID: \"2ee17011-d405-4e45-84c9-b48eb4ec6820\") " Oct 03 18:45:52 crc kubenswrapper[4835]: I1003 18:45:52.867349 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee17011-d405-4e45-84c9-b48eb4ec6820-kube-api-access-mdppk" (OuterVolumeSpecName: "kube-api-access-mdppk") pod "2ee17011-d405-4e45-84c9-b48eb4ec6820" (UID: "2ee17011-d405-4e45-84c9-b48eb4ec6820"). InnerVolumeSpecName "kube-api-access-mdppk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:45:52 crc kubenswrapper[4835]: I1003 18:45:52.896469 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee17011-d405-4e45-84c9-b48eb4ec6820-inventory" (OuterVolumeSpecName: "inventory") pod "2ee17011-d405-4e45-84c9-b48eb4ec6820" (UID: "2ee17011-d405-4e45-84c9-b48eb4ec6820"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:45:52 crc kubenswrapper[4835]: I1003 18:45:52.899270 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee17011-d405-4e45-84c9-b48eb4ec6820-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2ee17011-d405-4e45-84c9-b48eb4ec6820" (UID: "2ee17011-d405-4e45-84c9-b48eb4ec6820"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:45:52 crc kubenswrapper[4835]: I1003 18:45:52.963860 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdppk\" (UniqueName: \"kubernetes.io/projected/2ee17011-d405-4e45-84c9-b48eb4ec6820-kube-api-access-mdppk\") on node \"crc\" DevicePath \"\"" Oct 03 18:45:52 crc kubenswrapper[4835]: I1003 18:45:52.964086 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ee17011-d405-4e45-84c9-b48eb4ec6820-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 18:45:52 crc kubenswrapper[4835]: I1003 18:45:52.964170 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2ee17011-d405-4e45-84c9-b48eb4ec6820-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:45:53 crc kubenswrapper[4835]: I1003 18:45:53.244724 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" event={"ID":"2ee17011-d405-4e45-84c9-b48eb4ec6820","Type":"ContainerDied","Data":"14270b08cc0d7a686f41e4d4bff58d7330977b583cb8dcc20a47e44b0b3c233b"} Oct 03 18:45:53 crc kubenswrapper[4835]: I1003 18:45:53.244770 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14270b08cc0d7a686f41e4d4bff58d7330977b583cb8dcc20a47e44b0b3c233b" Oct 03 18:45:53 crc kubenswrapper[4835]: I1003 18:45:53.244783 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.030701 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5"] Oct 03 18:46:00 crc kubenswrapper[4835]: E1003 18:46:00.031826 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5af9ff5a-2dca-432e-888e-8e5e11bbabff" containerName="collect-profiles" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.031841 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5af9ff5a-2dca-432e-888e-8e5e11bbabff" containerName="collect-profiles" Oct 03 18:46:00 crc kubenswrapper[4835]: E1003 18:46:00.031886 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee17011-d405-4e45-84c9-b48eb4ec6820" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.031893 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee17011-d405-4e45-84c9-b48eb4ec6820" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.032140 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5af9ff5a-2dca-432e-888e-8e5e11bbabff" containerName="collect-profiles" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.032154 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee17011-d405-4e45-84c9-b48eb4ec6820" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.032818 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.036048 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.036159 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.036284 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.036396 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.045692 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5"] Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.196213 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8b1395f-37ef-43aa-94b2-4a761f358242-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5\" (UID: \"d8b1395f-37ef-43aa-94b2-4a761f358242\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.196349 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8b1395f-37ef-43aa-94b2-4a761f358242-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5\" (UID: \"d8b1395f-37ef-43aa-94b2-4a761f358242\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.196400 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftsvw\" (UniqueName: \"kubernetes.io/projected/d8b1395f-37ef-43aa-94b2-4a761f358242-kube-api-access-ftsvw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5\" (UID: \"d8b1395f-37ef-43aa-94b2-4a761f358242\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.297747 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8b1395f-37ef-43aa-94b2-4a761f358242-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5\" (UID: \"d8b1395f-37ef-43aa-94b2-4a761f358242\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.297926 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8b1395f-37ef-43aa-94b2-4a761f358242-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5\" (UID: \"d8b1395f-37ef-43aa-94b2-4a761f358242\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.297985 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftsvw\" (UniqueName: \"kubernetes.io/projected/d8b1395f-37ef-43aa-94b2-4a761f358242-kube-api-access-ftsvw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5\" 
(UID: \"d8b1395f-37ef-43aa-94b2-4a761f358242\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.304280 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8b1395f-37ef-43aa-94b2-4a761f358242-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5\" (UID: \"d8b1395f-37ef-43aa-94b2-4a761f358242\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.310570 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8b1395f-37ef-43aa-94b2-4a761f358242-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5\" (UID: \"d8b1395f-37ef-43aa-94b2-4a761f358242\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.314562 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftsvw\" (UniqueName: \"kubernetes.io/projected/d8b1395f-37ef-43aa-94b2-4a761f358242-kube-api-access-ftsvw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5\" (UID: \"d8b1395f-37ef-43aa-94b2-4a761f358242\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.355922 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.923368 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5"] Oct 03 18:46:00 crc kubenswrapper[4835]: I1003 18:46:00.939928 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 18:46:01 crc kubenswrapper[4835]: I1003 18:46:01.323830 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" event={"ID":"d8b1395f-37ef-43aa-94b2-4a761f358242","Type":"ContainerStarted","Data":"385d01c14cc0b533b2e08c7d12a0e5b4e7eaee502c005559c01256786697d01f"} Oct 03 18:46:02 crc kubenswrapper[4835]: I1003 18:46:02.334296 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" event={"ID":"d8b1395f-37ef-43aa-94b2-4a761f358242","Type":"ContainerStarted","Data":"e1809258beaafadc5bf16d40b3a76faf50f43503d9c0adc224527c1bf221a0b4"} Oct 03 18:46:50 crc kubenswrapper[4835]: I1003 18:46:50.788595 4835 generic.go:334] "Generic (PLEG): container finished" podID="d8b1395f-37ef-43aa-94b2-4a761f358242" containerID="e1809258beaafadc5bf16d40b3a76faf50f43503d9c0adc224527c1bf221a0b4" exitCode=0 Oct 03 18:46:50 crc kubenswrapper[4835]: I1003 18:46:50.788640 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" event={"ID":"d8b1395f-37ef-43aa-94b2-4a761f358242","Type":"ContainerDied","Data":"e1809258beaafadc5bf16d40b3a76faf50f43503d9c0adc224527c1bf221a0b4"} Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.245649 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.424630 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftsvw\" (UniqueName: \"kubernetes.io/projected/d8b1395f-37ef-43aa-94b2-4a761f358242-kube-api-access-ftsvw\") pod \"d8b1395f-37ef-43aa-94b2-4a761f358242\" (UID: \"d8b1395f-37ef-43aa-94b2-4a761f358242\") " Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.424750 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8b1395f-37ef-43aa-94b2-4a761f358242-inventory\") pod \"d8b1395f-37ef-43aa-94b2-4a761f358242\" (UID: \"d8b1395f-37ef-43aa-94b2-4a761f358242\") " Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.425092 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8b1395f-37ef-43aa-94b2-4a761f358242-ssh-key\") pod \"d8b1395f-37ef-43aa-94b2-4a761f358242\" (UID: \"d8b1395f-37ef-43aa-94b2-4a761f358242\") " Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.431001 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b1395f-37ef-43aa-94b2-4a761f358242-kube-api-access-ftsvw" (OuterVolumeSpecName: "kube-api-access-ftsvw") pod "d8b1395f-37ef-43aa-94b2-4a761f358242" (UID: "d8b1395f-37ef-43aa-94b2-4a761f358242"). InnerVolumeSpecName "kube-api-access-ftsvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.454449 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b1395f-37ef-43aa-94b2-4a761f358242-inventory" (OuterVolumeSpecName: "inventory") pod "d8b1395f-37ef-43aa-94b2-4a761f358242" (UID: "d8b1395f-37ef-43aa-94b2-4a761f358242"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.454940 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8b1395f-37ef-43aa-94b2-4a761f358242-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d8b1395f-37ef-43aa-94b2-4a761f358242" (UID: "d8b1395f-37ef-43aa-94b2-4a761f358242"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.528949 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d8b1395f-37ef-43aa-94b2-4a761f358242-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.529020 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftsvw\" (UniqueName: \"kubernetes.io/projected/d8b1395f-37ef-43aa-94b2-4a761f358242-kube-api-access-ftsvw\") on node \"crc\" DevicePath \"\"" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.529054 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d8b1395f-37ef-43aa-94b2-4a761f358242-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.821532 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" event={"ID":"d8b1395f-37ef-43aa-94b2-4a761f358242","Type":"ContainerDied","Data":"385d01c14cc0b533b2e08c7d12a0e5b4e7eaee502c005559c01256786697d01f"} Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.821829 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="385d01c14cc0b533b2e08c7d12a0e5b4e7eaee502c005559c01256786697d01f" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.821893 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.899998 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dz75p"] Oct 03 18:46:52 crc kubenswrapper[4835]: E1003 18:46:52.900435 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b1395f-37ef-43aa-94b2-4a761f358242" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.900454 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b1395f-37ef-43aa-94b2-4a761f358242" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.900689 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b1395f-37ef-43aa-94b2-4a761f358242" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.901552 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.905533 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.905687 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.906869 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.906880 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:46:52 crc kubenswrapper[4835]: I1003 18:46:52.914725 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dz75p"] Oct 03 18:46:53 crc kubenswrapper[4835]: I1003 18:46:53.038664 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmbsr\" (UniqueName: \"kubernetes.io/projected/8579b92c-53fb-4e67-af9b-40881365b520-kube-api-access-mmbsr\") pod \"ssh-known-hosts-edpm-deployment-dz75p\" (UID: \"8579b92c-53fb-4e67-af9b-40881365b520\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" Oct 03 18:46:53 crc kubenswrapper[4835]: I1003 18:46:53.039544 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8579b92c-53fb-4e67-af9b-40881365b520-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dz75p\" (UID: \"8579b92c-53fb-4e67-af9b-40881365b520\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" Oct 03 18:46:53 crc kubenswrapper[4835]: I1003 18:46:53.040403 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8579b92c-53fb-4e67-af9b-40881365b520-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dz75p\" (UID: \"8579b92c-53fb-4e67-af9b-40881365b520\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" Oct 03 18:46:53 crc kubenswrapper[4835]: I1003 18:46:53.142547 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8579b92c-53fb-4e67-af9b-40881365b520-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dz75p\" (UID: \"8579b92c-53fb-4e67-af9b-40881365b520\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" Oct 03 18:46:53 crc kubenswrapper[4835]: I1003 18:46:53.143163 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8579b92c-53fb-4e67-af9b-40881365b520-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dz75p\" (UID: \"8579b92c-53fb-4e67-af9b-40881365b520\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" Oct 03 18:46:53 crc kubenswrapper[4835]: I1003 18:46:53.143538 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmbsr\" (UniqueName: \"kubernetes.io/projected/8579b92c-53fb-4e67-af9b-40881365b520-kube-api-access-mmbsr\") pod \"ssh-known-hosts-edpm-deployment-dz75p\" (UID: \"8579b92c-53fb-4e67-af9b-40881365b520\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" Oct 03 18:46:53 crc 
kubenswrapper[4835]: I1003 18:46:53.148419 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8579b92c-53fb-4e67-af9b-40881365b520-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dz75p\" (UID: \"8579b92c-53fb-4e67-af9b-40881365b520\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" Oct 03 18:46:53 crc kubenswrapper[4835]: I1003 18:46:53.154265 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8579b92c-53fb-4e67-af9b-40881365b520-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dz75p\" (UID: \"8579b92c-53fb-4e67-af9b-40881365b520\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" Oct 03 18:46:53 crc kubenswrapper[4835]: I1003 18:46:53.160591 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmbsr\" (UniqueName: \"kubernetes.io/projected/8579b92c-53fb-4e67-af9b-40881365b520-kube-api-access-mmbsr\") pod \"ssh-known-hosts-edpm-deployment-dz75p\" (UID: \"8579b92c-53fb-4e67-af9b-40881365b520\") " pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" Oct 03 18:46:53 crc kubenswrapper[4835]: I1003 18:46:53.227408 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" Oct 03 18:46:53 crc kubenswrapper[4835]: I1003 18:46:53.727701 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dz75p"] Oct 03 18:46:53 crc kubenswrapper[4835]: I1003 18:46:53.834519 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" event={"ID":"8579b92c-53fb-4e67-af9b-40881365b520","Type":"ContainerStarted","Data":"bf0fa117522a8f54dfc5e65450ba7e6c6e09ecdac62fc67d7acc25b5a40b5825"} Oct 03 18:46:54 crc kubenswrapper[4835]: I1003 18:46:54.844005 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" event={"ID":"8579b92c-53fb-4e67-af9b-40881365b520","Type":"ContainerStarted","Data":"9eced2405dd693120d456e990f9ba3af5cfaddacb6e91e16a9c9dfd27d3f64f8"} Oct 03 18:46:54 crc kubenswrapper[4835]: I1003 18:46:54.867686 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" podStartSLOduration=2.439001768 podStartE2EDuration="2.86766786s" podCreationTimestamp="2025-10-03 18:46:52 +0000 UTC" firstStartedPulling="2025-10-03 18:46:53.73254574 +0000 UTC m=+1955.448486612" lastFinishedPulling="2025-10-03 18:46:54.161211822 +0000 UTC m=+1955.877152704" observedRunningTime="2025-10-03 18:46:54.859042677 +0000 UTC m=+1956.574983549" watchObservedRunningTime="2025-10-03 18:46:54.86766786 +0000 UTC m=+1956.583608732" Oct 03 18:47:01 crc kubenswrapper[4835]: I1003 18:47:01.917911 4835 generic.go:334] "Generic (PLEG): container finished" podID="8579b92c-53fb-4e67-af9b-40881365b520" containerID="9eced2405dd693120d456e990f9ba3af5cfaddacb6e91e16a9c9dfd27d3f64f8" exitCode=0 Oct 03 18:47:01 crc kubenswrapper[4835]: I1003 18:47:01.918010 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" event={"ID":"8579b92c-53fb-4e67-af9b-40881365b520","Type":"ContainerDied","Data":"9eced2405dd693120d456e990f9ba3af5cfaddacb6e91e16a9c9dfd27d3f64f8"} Oct 03 18:47:03 crc kubenswrapper[4835]: I1003 18:47:03.340325 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" Oct 03 18:47:03 crc kubenswrapper[4835]: I1003 18:47:03.369440 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmbsr\" (UniqueName: \"kubernetes.io/projected/8579b92c-53fb-4e67-af9b-40881365b520-kube-api-access-mmbsr\") pod \"8579b92c-53fb-4e67-af9b-40881365b520\" (UID: \"8579b92c-53fb-4e67-af9b-40881365b520\") " Oct 03 18:47:03 crc kubenswrapper[4835]: I1003 18:47:03.369482 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8579b92c-53fb-4e67-af9b-40881365b520-ssh-key-openstack-edpm-ipam\") pod \"8579b92c-53fb-4e67-af9b-40881365b520\" (UID: \"8579b92c-53fb-4e67-af9b-40881365b520\") " Oct 03 18:47:03 crc kubenswrapper[4835]: I1003 18:47:03.369726 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8579b92c-53fb-4e67-af9b-40881365b520-inventory-0\") pod \"8579b92c-53fb-4e67-af9b-40881365b520\" (UID: \"8579b92c-53fb-4e67-af9b-40881365b520\") " Oct 03 18:47:03 crc kubenswrapper[4835]: I1003 18:47:03.379950 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8579b92c-53fb-4e67-af9b-40881365b520-kube-api-access-mmbsr" (OuterVolumeSpecName: "kube-api-access-mmbsr") pod "8579b92c-53fb-4e67-af9b-40881365b520" (UID: "8579b92c-53fb-4e67-af9b-40881365b520"). InnerVolumeSpecName "kube-api-access-mmbsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:47:03 crc kubenswrapper[4835]: E1003 18:47:03.401929 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8579b92c-53fb-4e67-af9b-40881365b520-ssh-key-openstack-edpm-ipam podName:8579b92c-53fb-4e67-af9b-40881365b520 nodeName:}" failed. No retries permitted until 2025-10-03 18:47:03.901905383 +0000 UTC m=+1965.617846255 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/8579b92c-53fb-4e67-af9b-40881365b520-ssh-key-openstack-edpm-ipam") pod "8579b92c-53fb-4e67-af9b-40881365b520" (UID: "8579b92c-53fb-4e67-af9b-40881365b520") : error deleting /var/lib/kubelet/pods/8579b92c-53fb-4e67-af9b-40881365b520/volume-subpaths: remove /var/lib/kubelet/pods/8579b92c-53fb-4e67-af9b-40881365b520/volume-subpaths: no such file or directory Oct 03 18:47:03 crc kubenswrapper[4835]: I1003 18:47:03.404752 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8579b92c-53fb-4e67-af9b-40881365b520-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "8579b92c-53fb-4e67-af9b-40881365b520" (UID: "8579b92c-53fb-4e67-af9b-40881365b520"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:47:03 crc kubenswrapper[4835]: I1003 18:47:03.470821 4835 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8579b92c-53fb-4e67-af9b-40881365b520-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:47:03 crc kubenswrapper[4835]: I1003 18:47:03.470847 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmbsr\" (UniqueName: \"kubernetes.io/projected/8579b92c-53fb-4e67-af9b-40881365b520-kube-api-access-mmbsr\") on node \"crc\" DevicePath \"\"" Oct 03 18:47:03 crc kubenswrapper[4835]: I1003 18:47:03.976705 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" event={"ID":"8579b92c-53fb-4e67-af9b-40881365b520","Type":"ContainerDied","Data":"bf0fa117522a8f54dfc5e65450ba7e6c6e09ecdac62fc67d7acc25b5a40b5825"} Oct 03 18:47:03 crc kubenswrapper[4835]: I1003 18:47:03.976759 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf0fa117522a8f54dfc5e65450ba7e6c6e09ecdac62fc67d7acc25b5a40b5825" Oct 03 18:47:03 crc kubenswrapper[4835]: I1003 18:47:03.976769 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dz75p" Oct 03 18:47:03 crc kubenswrapper[4835]: I1003 18:47:03.978612 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8579b92c-53fb-4e67-af9b-40881365b520-ssh-key-openstack-edpm-ipam\") pod \"8579b92c-53fb-4e67-af9b-40881365b520\" (UID: \"8579b92c-53fb-4e67-af9b-40881365b520\") " Oct 03 18:47:03 crc kubenswrapper[4835]: I1003 18:47:03.982152 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8579b92c-53fb-4e67-af9b-40881365b520-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8579b92c-53fb-4e67-af9b-40881365b520" (UID: "8579b92c-53fb-4e67-af9b-40881365b520"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.011772 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs"] Oct 03 18:47:04 crc kubenswrapper[4835]: E1003 18:47:04.012239 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8579b92c-53fb-4e67-af9b-40881365b520" containerName="ssh-known-hosts-edpm-deployment" Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.012260 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8579b92c-53fb-4e67-af9b-40881365b520" containerName="ssh-known-hosts-edpm-deployment" Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.012891 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8579b92c-53fb-4e67-af9b-40881365b520" containerName="ssh-known-hosts-edpm-deployment" Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.013785 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.022452 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs"] Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.080659 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8579b92c-53fb-4e67-af9b-40881365b520-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.182501 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8q76\" (UniqueName: \"kubernetes.io/projected/94670192-4404-4657-8f2a-c493b239e2bd-kube-api-access-k8q76\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hzlrs\" (UID: \"94670192-4404-4657-8f2a-c493b239e2bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.182645 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94670192-4404-4657-8f2a-c493b239e2bd-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hzlrs\" (UID: \"94670192-4404-4657-8f2a-c493b239e2bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.182706 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94670192-4404-4657-8f2a-c493b239e2bd-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hzlrs\" (UID: \"94670192-4404-4657-8f2a-c493b239e2bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.284282 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8q76\" (UniqueName: \"kubernetes.io/projected/94670192-4404-4657-8f2a-c493b239e2bd-kube-api-access-k8q76\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hzlrs\" (UID: \"94670192-4404-4657-8f2a-c493b239e2bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.284368 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94670192-4404-4657-8f2a-c493b239e2bd-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hzlrs\" (UID: \"94670192-4404-4657-8f2a-c493b239e2bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.284409 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94670192-4404-4657-8f2a-c493b239e2bd-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hzlrs\" (UID: \"94670192-4404-4657-8f2a-c493b239e2bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.288684 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94670192-4404-4657-8f2a-c493b239e2bd-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hzlrs\" (UID: \"94670192-4404-4657-8f2a-c493b239e2bd\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.289613 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94670192-4404-4657-8f2a-c493b239e2bd-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hzlrs\" (UID: \"94670192-4404-4657-8f2a-c493b239e2bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.300043 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8q76\" (UniqueName: \"kubernetes.io/projected/94670192-4404-4657-8f2a-c493b239e2bd-kube-api-access-k8q76\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hzlrs\" (UID: \"94670192-4404-4657-8f2a-c493b239e2bd\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.355903 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.892639 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs"] Oct 03 18:47:04 crc kubenswrapper[4835]: I1003 18:47:04.985163 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" event={"ID":"94670192-4404-4657-8f2a-c493b239e2bd","Type":"ContainerStarted","Data":"55c21f40d70e87ad23234dd4f99d6d5cc708b98dde4292a888117e2514539558"} Oct 03 18:47:05 crc kubenswrapper[4835]: I1003 18:47:05.994291 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" event={"ID":"94670192-4404-4657-8f2a-c493b239e2bd","Type":"ContainerStarted","Data":"64b8314463cc9fef1556fb9d1a1756f85dceb34392d34818496cf968aaca7d31"} Oct 03 18:47:06 crc kubenswrapper[4835]: I1003 18:47:06.015381 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" podStartSLOduration=2.527065071 podStartE2EDuration="3.015366364s" podCreationTimestamp="2025-10-03 18:47:03 +0000 UTC" firstStartedPulling="2025-10-03 18:47:04.895731087 +0000 UTC m=+1966.611671949" lastFinishedPulling="2025-10-03 18:47:05.38403237 +0000 UTC m=+1967.099973242" observedRunningTime="2025-10-03 18:47:06.008014993 +0000 UTC m=+1967.723955865" watchObservedRunningTime="2025-10-03 18:47:06.015366364 +0000 UTC m=+1967.731307236" Oct 03 18:47:13 crc kubenswrapper[4835]: I1003 18:47:13.277540 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lstws"] Oct 03 18:47:13 crc kubenswrapper[4835]: I1003 18:47:13.280216 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:13 crc kubenswrapper[4835]: I1003 18:47:13.286753 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lstws"] Oct 03 18:47:13 crc kubenswrapper[4835]: I1003 18:47:13.360484 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f12e17-4d13-4099-b8c2-12b224ace2a6-utilities\") pod \"redhat-marketplace-lstws\" (UID: \"b3f12e17-4d13-4099-b8c2-12b224ace2a6\") " pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:13 crc kubenswrapper[4835]: I1003 18:47:13.361114 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn5vz\" (UniqueName: \"kubernetes.io/projected/b3f12e17-4d13-4099-b8c2-12b224ace2a6-kube-api-access-dn5vz\") pod \"redhat-marketplace-lstws\" (UID: \"b3f12e17-4d13-4099-b8c2-12b224ace2a6\") " pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:13 crc kubenswrapper[4835]: I1003 18:47:13.361621 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f12e17-4d13-4099-b8c2-12b224ace2a6-catalog-content\") pod \"redhat-marketplace-lstws\" (UID: \"b3f12e17-4d13-4099-b8c2-12b224ace2a6\") " pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:13 crc kubenswrapper[4835]: I1003 18:47:13.465324 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f12e17-4d13-4099-b8c2-12b224ace2a6-catalog-content\") pod \"redhat-marketplace-lstws\" (UID: \"b3f12e17-4d13-4099-b8c2-12b224ace2a6\") " pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:13 crc kubenswrapper[4835]: I1003 18:47:13.465490 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f12e17-4d13-4099-b8c2-12b224ace2a6-utilities\") pod \"redhat-marketplace-lstws\" (UID: \"b3f12e17-4d13-4099-b8c2-12b224ace2a6\") " pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:13 crc kubenswrapper[4835]: I1003 18:47:13.465568 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn5vz\" (UniqueName: \"kubernetes.io/projected/b3f12e17-4d13-4099-b8c2-12b224ace2a6-kube-api-access-dn5vz\") pod \"redhat-marketplace-lstws\" (UID: \"b3f12e17-4d13-4099-b8c2-12b224ace2a6\") " pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:13 crc kubenswrapper[4835]: I1003 18:47:13.466523 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f12e17-4d13-4099-b8c2-12b224ace2a6-catalog-content\") pod \"redhat-marketplace-lstws\" (UID: \"b3f12e17-4d13-4099-b8c2-12b224ace2a6\") " pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:13 crc kubenswrapper[4835]: I1003 18:47:13.466561 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f12e17-4d13-4099-b8c2-12b224ace2a6-utilities\") pod \"redhat-marketplace-lstws\" (UID: \"b3f12e17-4d13-4099-b8c2-12b224ace2a6\") " pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:13 crc kubenswrapper[4835]: I1003 18:47:13.491130 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dn5vz\" (UniqueName: \"kubernetes.io/projected/b3f12e17-4d13-4099-b8c2-12b224ace2a6-kube-api-access-dn5vz\") pod \"redhat-marketplace-lstws\" (UID: \"b3f12e17-4d13-4099-b8c2-12b224ace2a6\") " pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:13 crc kubenswrapper[4835]: I1003 18:47:13.601690 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:14 crc kubenswrapper[4835]: I1003 18:47:14.040619 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lstws"] Oct 03 18:47:14 crc kubenswrapper[4835]: I1003 18:47:14.064863 4835 generic.go:334] "Generic (PLEG): container finished" podID="94670192-4404-4657-8f2a-c493b239e2bd" containerID="64b8314463cc9fef1556fb9d1a1756f85dceb34392d34818496cf968aaca7d31" exitCode=0 Oct 03 18:47:14 crc kubenswrapper[4835]: I1003 18:47:14.064944 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" event={"ID":"94670192-4404-4657-8f2a-c493b239e2bd","Type":"ContainerDied","Data":"64b8314463cc9fef1556fb9d1a1756f85dceb34392d34818496cf968aaca7d31"} Oct 03 18:47:14 crc kubenswrapper[4835]: I1003 18:47:14.066913 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lstws" event={"ID":"b3f12e17-4d13-4099-b8c2-12b224ace2a6","Type":"ContainerStarted","Data":"150f9caf37b34797ad5e8773a98accac0163687e262562048e677cd70085924c"} Oct 03 18:47:15 crc kubenswrapper[4835]: I1003 18:47:15.079805 4835 generic.go:334] "Generic (PLEG): container finished" podID="b3f12e17-4d13-4099-b8c2-12b224ace2a6" containerID="b4ddd1a0878465623ca47a47bbccbfc38df09777e43e9cef4871b718deb100cd" exitCode=0 Oct 03 18:47:15 crc kubenswrapper[4835]: I1003 18:47:15.079992 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lstws" event={"ID":"b3f12e17-4d13-4099-b8c2-12b224ace2a6","Type":"ContainerDied","Data":"b4ddd1a0878465623ca47a47bbccbfc38df09777e43e9cef4871b718deb100cd"} Oct 03 18:47:15 crc kubenswrapper[4835]: I1003 18:47:15.527023 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" Oct 03 18:47:15 crc kubenswrapper[4835]: I1003 18:47:15.611835 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94670192-4404-4657-8f2a-c493b239e2bd-ssh-key\") pod \"94670192-4404-4657-8f2a-c493b239e2bd\" (UID: \"94670192-4404-4657-8f2a-c493b239e2bd\") " Oct 03 18:47:15 crc kubenswrapper[4835]: I1003 18:47:15.612036 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8q76\" (UniqueName: \"kubernetes.io/projected/94670192-4404-4657-8f2a-c493b239e2bd-kube-api-access-k8q76\") pod \"94670192-4404-4657-8f2a-c493b239e2bd\" (UID: \"94670192-4404-4657-8f2a-c493b239e2bd\") " Oct 03 18:47:15 crc kubenswrapper[4835]: I1003 18:47:15.612172 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94670192-4404-4657-8f2a-c493b239e2bd-inventory\") pod \"94670192-4404-4657-8f2a-c493b239e2bd\" (UID: \"94670192-4404-4657-8f2a-c493b239e2bd\") " Oct 03 18:47:15 crc kubenswrapper[4835]: I1003 18:47:15.618148 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94670192-4404-4657-8f2a-c493b239e2bd-kube-api-access-k8q76" (OuterVolumeSpecName: "kube-api-access-k8q76") pod "94670192-4404-4657-8f2a-c493b239e2bd" (UID: "94670192-4404-4657-8f2a-c493b239e2bd"). InnerVolumeSpecName "kube-api-access-k8q76". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:47:15 crc kubenswrapper[4835]: I1003 18:47:15.645606 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94670192-4404-4657-8f2a-c493b239e2bd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "94670192-4404-4657-8f2a-c493b239e2bd" (UID: "94670192-4404-4657-8f2a-c493b239e2bd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:47:15 crc kubenswrapper[4835]: I1003 18:47:15.663612 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94670192-4404-4657-8f2a-c493b239e2bd-inventory" (OuterVolumeSpecName: "inventory") pod "94670192-4404-4657-8f2a-c493b239e2bd" (UID: "94670192-4404-4657-8f2a-c493b239e2bd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:47:15 crc kubenswrapper[4835]: I1003 18:47:15.713898 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8q76\" (UniqueName: \"kubernetes.io/projected/94670192-4404-4657-8f2a-c493b239e2bd-kube-api-access-k8q76\") on node \"crc\" DevicePath \"\"" Oct 03 18:47:15 crc kubenswrapper[4835]: I1003 18:47:15.714038 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94670192-4404-4657-8f2a-c493b239e2bd-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 18:47:15 crc kubenswrapper[4835]: I1003 18:47:15.714112 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94670192-4404-4657-8f2a-c493b239e2bd-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.089206 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" event={"ID":"94670192-4404-4657-8f2a-c493b239e2bd","Type":"ContainerDied","Data":"55c21f40d70e87ad23234dd4f99d6d5cc708b98dde4292a888117e2514539558"} Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.089447 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55c21f40d70e87ad23234dd4f99d6d5cc708b98dde4292a888117e2514539558" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.089268 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hzlrs" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.091509 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lstws" event={"ID":"b3f12e17-4d13-4099-b8c2-12b224ace2a6","Type":"ContainerStarted","Data":"4143047a90907333e566bab8e23526b97e7d2fc61e0f9b680324438896c9d481"} Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.159768 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb"] Oct 03 18:47:16 crc kubenswrapper[4835]: E1003 18:47:16.160403 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94670192-4404-4657-8f2a-c493b239e2bd" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.160436 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="94670192-4404-4657-8f2a-c493b239e2bd" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.160743 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="94670192-4404-4657-8f2a-c493b239e2bd" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.161934 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.173755 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.176629 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.176997 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.177110 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.187913 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb"] Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.324743 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fz9c\" (UniqueName: \"kubernetes.io/projected/ff850717-c781-4037-81d1-d5538ea47f65-kube-api-access-4fz9c\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb\" (UID: \"ff850717-c781-4037-81d1-d5538ea47f65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.325058 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff850717-c781-4037-81d1-d5538ea47f65-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb\" (UID: \"ff850717-c781-4037-81d1-d5538ea47f65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.325129 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff850717-c781-4037-81d1-d5538ea47f65-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb\" (UID: \"ff850717-c781-4037-81d1-d5538ea47f65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.427799 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff850717-c781-4037-81d1-d5538ea47f65-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb\" (UID: \"ff850717-c781-4037-81d1-d5538ea47f65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.427936 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff850717-c781-4037-81d1-d5538ea47f65-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb\" (UID: \"ff850717-c781-4037-81d1-d5538ea47f65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.428385 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fz9c\" (UniqueName: \"kubernetes.io/projected/ff850717-c781-4037-81d1-d5538ea47f65-kube-api-access-4fz9c\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb\" (UID: 
\"ff850717-c781-4037-81d1-d5538ea47f65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.434795 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff850717-c781-4037-81d1-d5538ea47f65-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb\" (UID: \"ff850717-c781-4037-81d1-d5538ea47f65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.436610 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff850717-c781-4037-81d1-d5538ea47f65-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb\" (UID: \"ff850717-c781-4037-81d1-d5538ea47f65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.444491 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fz9c\" (UniqueName: \"kubernetes.io/projected/ff850717-c781-4037-81d1-d5538ea47f65-kube-api-access-4fz9c\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb\" (UID: \"ff850717-c781-4037-81d1-d5538ea47f65\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" Oct 03 18:47:16 crc kubenswrapper[4835]: I1003 18:47:16.548171 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" Oct 03 18:47:17 crc kubenswrapper[4835]: I1003 18:47:17.078299 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb"] Oct 03 18:47:17 crc kubenswrapper[4835]: W1003 18:47:17.080905 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff850717_c781_4037_81d1_d5538ea47f65.slice/crio-b4d71e5f1e28b0a872936741b3a0bc67bee432d1c6aec28f18cba408468c8214 WatchSource:0}: Error finding container b4d71e5f1e28b0a872936741b3a0bc67bee432d1c6aec28f18cba408468c8214: Status 404 returned error can't find the container with id b4d71e5f1e28b0a872936741b3a0bc67bee432d1c6aec28f18cba408468c8214 Oct 03 18:47:17 crc kubenswrapper[4835]: I1003 18:47:17.101845 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" event={"ID":"ff850717-c781-4037-81d1-d5538ea47f65","Type":"ContainerStarted","Data":"b4d71e5f1e28b0a872936741b3a0bc67bee432d1c6aec28f18cba408468c8214"} Oct 03 18:47:17 crc kubenswrapper[4835]: I1003 18:47:17.104478 4835 generic.go:334] "Generic (PLEG): container finished" podID="b3f12e17-4d13-4099-b8c2-12b224ace2a6" containerID="4143047a90907333e566bab8e23526b97e7d2fc61e0f9b680324438896c9d481" exitCode=0 Oct 03 18:47:17 crc kubenswrapper[4835]: I1003 18:47:17.104551 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lstws" event={"ID":"b3f12e17-4d13-4099-b8c2-12b224ace2a6","Type":"ContainerDied","Data":"4143047a90907333e566bab8e23526b97e7d2fc61e0f9b680324438896c9d481"} Oct 03 18:47:18 crc kubenswrapper[4835]: I1003 18:47:18.113190 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" 
event={"ID":"ff850717-c781-4037-81d1-d5538ea47f65","Type":"ContainerStarted","Data":"2aa5f2c6fd83712a94cae62c9f278a4d526f521f6dfe2ab94535d61e56f2d179"} Oct 03 18:47:18 crc kubenswrapper[4835]: I1003 18:47:18.117607 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lstws" event={"ID":"b3f12e17-4d13-4099-b8c2-12b224ace2a6","Type":"ContainerStarted","Data":"c98d593fc08862fb9eeecfe83dfad463ac4d354afab951d9a4fe0387184d36ed"} Oct 03 18:47:18 crc kubenswrapper[4835]: I1003 18:47:18.132057 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" podStartSLOduration=1.611062336 podStartE2EDuration="2.132039646s" podCreationTimestamp="2025-10-03 18:47:16 +0000 UTC" firstStartedPulling="2025-10-03 18:47:17.084480158 +0000 UTC m=+1978.800421040" lastFinishedPulling="2025-10-03 18:47:17.605457478 +0000 UTC m=+1979.321398350" observedRunningTime="2025-10-03 18:47:18.128881888 +0000 UTC m=+1979.844822770" watchObservedRunningTime="2025-10-03 18:47:18.132039646 +0000 UTC m=+1979.847980518" Oct 03 18:47:18 crc kubenswrapper[4835]: I1003 18:47:18.155506 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lstws" podStartSLOduration=2.731743886 podStartE2EDuration="5.155488035s" podCreationTimestamp="2025-10-03 18:47:13 +0000 UTC" firstStartedPulling="2025-10-03 18:47:15.092552858 +0000 UTC m=+1976.808493730" lastFinishedPulling="2025-10-03 18:47:17.516297007 +0000 UTC m=+1979.232237879" observedRunningTime="2025-10-03 18:47:18.149533197 +0000 UTC m=+1979.865474079" watchObservedRunningTime="2025-10-03 18:47:18.155488035 +0000 UTC m=+1979.871428917" Oct 03 18:47:23 crc kubenswrapper[4835]: I1003 18:47:23.602167 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:23 crc kubenswrapper[4835]: I1003 18:47:23.602735 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:23 crc kubenswrapper[4835]: I1003 18:47:23.657631 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:24 crc kubenswrapper[4835]: I1003 18:47:24.216371 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:24 crc kubenswrapper[4835]: I1003 18:47:24.270375 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lstws"] Oct 03 18:47:26 crc kubenswrapper[4835]: I1003 18:47:26.182686 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lstws" podUID="b3f12e17-4d13-4099-b8c2-12b224ace2a6" containerName="registry-server" containerID="cri-o://c98d593fc08862fb9eeecfe83dfad463ac4d354afab951d9a4fe0387184d36ed" gracePeriod=2 Oct 03 18:47:26 crc kubenswrapper[4835]: I1003 18:47:26.719136 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:26 crc kubenswrapper[4835]: I1003 18:47:26.834815 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f12e17-4d13-4099-b8c2-12b224ace2a6-catalog-content\") pod \"b3f12e17-4d13-4099-b8c2-12b224ace2a6\" (UID: \"b3f12e17-4d13-4099-b8c2-12b224ace2a6\") " Oct 03 18:47:26 crc kubenswrapper[4835]: I1003 18:47:26.835017 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn5vz\" (UniqueName: \"kubernetes.io/projected/b3f12e17-4d13-4099-b8c2-12b224ace2a6-kube-api-access-dn5vz\") pod \"b3f12e17-4d13-4099-b8c2-12b224ace2a6\" (UID: \"b3f12e17-4d13-4099-b8c2-12b224ace2a6\") " Oct 03 18:47:26 crc kubenswrapper[4835]: I1003 18:47:26.835082 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f12e17-4d13-4099-b8c2-12b224ace2a6-utilities\") pod \"b3f12e17-4d13-4099-b8c2-12b224ace2a6\" (UID: \"b3f12e17-4d13-4099-b8c2-12b224ace2a6\") " Oct 03 18:47:26 crc kubenswrapper[4835]: I1003 18:47:26.836001 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3f12e17-4d13-4099-b8c2-12b224ace2a6-utilities" (OuterVolumeSpecName: "utilities") pod "b3f12e17-4d13-4099-b8c2-12b224ace2a6" (UID: "b3f12e17-4d13-4099-b8c2-12b224ace2a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:47:26 crc kubenswrapper[4835]: I1003 18:47:26.846058 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3f12e17-4d13-4099-b8c2-12b224ace2a6-kube-api-access-dn5vz" (OuterVolumeSpecName: "kube-api-access-dn5vz") pod "b3f12e17-4d13-4099-b8c2-12b224ace2a6" (UID: "b3f12e17-4d13-4099-b8c2-12b224ace2a6"). InnerVolumeSpecName "kube-api-access-dn5vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:47:26 crc kubenswrapper[4835]: I1003 18:47:26.852587 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3f12e17-4d13-4099-b8c2-12b224ace2a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3f12e17-4d13-4099-b8c2-12b224ace2a6" (UID: "b3f12e17-4d13-4099-b8c2-12b224ace2a6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:47:26 crc kubenswrapper[4835]: I1003 18:47:26.936796 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f12e17-4d13-4099-b8c2-12b224ace2a6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:47:26 crc kubenswrapper[4835]: I1003 18:47:26.936848 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn5vz\" (UniqueName: \"kubernetes.io/projected/b3f12e17-4d13-4099-b8c2-12b224ace2a6-kube-api-access-dn5vz\") on node \"crc\" DevicePath \"\"" Oct 03 18:47:26 crc kubenswrapper[4835]: I1003 18:47:26.936859 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f12e17-4d13-4099-b8c2-12b224ace2a6-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:47:27 crc kubenswrapper[4835]: E1003 18:47:27.005036 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3f12e17_4d13_4099_b8c2_12b224ace2a6.slice\": RecentStats: unable to find data in memory cache]" Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.197252 4835 generic.go:334] "Generic (PLEG): container finished" podID="b3f12e17-4d13-4099-b8c2-12b224ace2a6" containerID="c98d593fc08862fb9eeecfe83dfad463ac4d354afab951d9a4fe0387184d36ed" exitCode=0 Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.197308 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lstws" Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.197358 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lstws" event={"ID":"b3f12e17-4d13-4099-b8c2-12b224ace2a6","Type":"ContainerDied","Data":"c98d593fc08862fb9eeecfe83dfad463ac4d354afab951d9a4fe0387184d36ed"} Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.197402 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lstws" event={"ID":"b3f12e17-4d13-4099-b8c2-12b224ace2a6","Type":"ContainerDied","Data":"150f9caf37b34797ad5e8773a98accac0163687e262562048e677cd70085924c"} Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.197431 4835 scope.go:117] "RemoveContainer" containerID="c98d593fc08862fb9eeecfe83dfad463ac4d354afab951d9a4fe0387184d36ed" Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.200659 4835 generic.go:334] "Generic (PLEG): container finished" podID="ff850717-c781-4037-81d1-d5538ea47f65" containerID="2aa5f2c6fd83712a94cae62c9f278a4d526f521f6dfe2ab94535d61e56f2d179" exitCode=0 Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.200696 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" event={"ID":"ff850717-c781-4037-81d1-d5538ea47f65","Type":"ContainerDied","Data":"2aa5f2c6fd83712a94cae62c9f278a4d526f521f6dfe2ab94535d61e56f2d179"} Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.251491 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lstws"] Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.255963 4835 scope.go:117] "RemoveContainer" containerID="4143047a90907333e566bab8e23526b97e7d2fc61e0f9b680324438896c9d481" Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.260272 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-lstws"] Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.280966 4835 scope.go:117] "RemoveContainer" containerID="b4ddd1a0878465623ca47a47bbccbfc38df09777e43e9cef4871b718deb100cd" Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.348421 4835 scope.go:117] "RemoveContainer" containerID="c98d593fc08862fb9eeecfe83dfad463ac4d354afab951d9a4fe0387184d36ed" Oct 03 18:47:27 crc kubenswrapper[4835]: E1003 18:47:27.349174 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c98d593fc08862fb9eeecfe83dfad463ac4d354afab951d9a4fe0387184d36ed\": container with ID starting with c98d593fc08862fb9eeecfe83dfad463ac4d354afab951d9a4fe0387184d36ed not found: ID does not exist" containerID="c98d593fc08862fb9eeecfe83dfad463ac4d354afab951d9a4fe0387184d36ed" Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.349217 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c98d593fc08862fb9eeecfe83dfad463ac4d354afab951d9a4fe0387184d36ed"} err="failed to get container status \"c98d593fc08862fb9eeecfe83dfad463ac4d354afab951d9a4fe0387184d36ed\": rpc error: code = NotFound desc = could not find container \"c98d593fc08862fb9eeecfe83dfad463ac4d354afab951d9a4fe0387184d36ed\": container with ID starting with c98d593fc08862fb9eeecfe83dfad463ac4d354afab951d9a4fe0387184d36ed not found: ID does not exist" Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.349244 4835 scope.go:117] "RemoveContainer" containerID="4143047a90907333e566bab8e23526b97e7d2fc61e0f9b680324438896c9d481" Oct 03 18:47:27 crc kubenswrapper[4835]: E1003 18:47:27.349700 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4143047a90907333e566bab8e23526b97e7d2fc61e0f9b680324438896c9d481\": container with ID starting with 4143047a90907333e566bab8e23526b97e7d2fc61e0f9b680324438896c9d481 not found: ID does not exist" containerID="4143047a90907333e566bab8e23526b97e7d2fc61e0f9b680324438896c9d481" Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.349750 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4143047a90907333e566bab8e23526b97e7d2fc61e0f9b680324438896c9d481"} err="failed to get container status \"4143047a90907333e566bab8e23526b97e7d2fc61e0f9b680324438896c9d481\": rpc error: code = NotFound desc = could not find container \"4143047a90907333e566bab8e23526b97e7d2fc61e0f9b680324438896c9d481\": container with ID starting with 4143047a90907333e566bab8e23526b97e7d2fc61e0f9b680324438896c9d481 not found: ID does not exist" Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.349780 4835 scope.go:117] "RemoveContainer" containerID="b4ddd1a0878465623ca47a47bbccbfc38df09777e43e9cef4871b718deb100cd" Oct 03 18:47:27 crc kubenswrapper[4835]: E1003 18:47:27.350295 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4ddd1a0878465623ca47a47bbccbfc38df09777e43e9cef4871b718deb100cd\": container with ID starting with b4ddd1a0878465623ca47a47bbccbfc38df09777e43e9cef4871b718deb100cd not found: ID does not exist" containerID="b4ddd1a0878465623ca47a47bbccbfc38df09777e43e9cef4871b718deb100cd" Oct 03 18:47:27 crc kubenswrapper[4835]: I1003 18:47:27.350325 4835 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b4ddd1a0878465623ca47a47bbccbfc38df09777e43e9cef4871b718deb100cd"} err="failed to get container status \"b4ddd1a0878465623ca47a47bbccbfc38df09777e43e9cef4871b718deb100cd\": rpc error: code = NotFound desc = could not find container \"b4ddd1a0878465623ca47a47bbccbfc38df09777e43e9cef4871b718deb100cd\": container with ID starting with b4ddd1a0878465623ca47a47bbccbfc38df09777e43e9cef4871b718deb100cd not found: ID does not exist" Oct 03 18:47:28 crc kubenswrapper[4835]: I1003 18:47:28.669750 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" Oct 03 18:47:28 crc kubenswrapper[4835]: I1003 18:47:28.774064 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fz9c\" (UniqueName: \"kubernetes.io/projected/ff850717-c781-4037-81d1-d5538ea47f65-kube-api-access-4fz9c\") pod \"ff850717-c781-4037-81d1-d5538ea47f65\" (UID: \"ff850717-c781-4037-81d1-d5538ea47f65\") " Oct 03 18:47:28 crc kubenswrapper[4835]: I1003 18:47:28.774157 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff850717-c781-4037-81d1-d5538ea47f65-inventory\") pod \"ff850717-c781-4037-81d1-d5538ea47f65\" (UID: \"ff850717-c781-4037-81d1-d5538ea47f65\") " Oct 03 18:47:28 crc kubenswrapper[4835]: I1003 18:47:28.774249 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff850717-c781-4037-81d1-d5538ea47f65-ssh-key\") pod \"ff850717-c781-4037-81d1-d5538ea47f65\" (UID: \"ff850717-c781-4037-81d1-d5538ea47f65\") " Oct 03 18:47:28 crc kubenswrapper[4835]: I1003 18:47:28.787434 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff850717-c781-4037-81d1-d5538ea47f65-kube-api-access-4fz9c" (OuterVolumeSpecName: "kube-api-access-4fz9c") pod "ff850717-c781-4037-81d1-d5538ea47f65" (UID: "ff850717-c781-4037-81d1-d5538ea47f65"). InnerVolumeSpecName "kube-api-access-4fz9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:47:28 crc kubenswrapper[4835]: I1003 18:47:28.800309 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff850717-c781-4037-81d1-d5538ea47f65-inventory" (OuterVolumeSpecName: "inventory") pod "ff850717-c781-4037-81d1-d5538ea47f65" (UID: "ff850717-c781-4037-81d1-d5538ea47f65"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:47:28 crc kubenswrapper[4835]: I1003 18:47:28.810487 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff850717-c781-4037-81d1-d5538ea47f65-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ff850717-c781-4037-81d1-d5538ea47f65" (UID: "ff850717-c781-4037-81d1-d5538ea47f65"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:47:28 crc kubenswrapper[4835]: I1003 18:47:28.880147 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fz9c\" (UniqueName: \"kubernetes.io/projected/ff850717-c781-4037-81d1-d5538ea47f65-kube-api-access-4fz9c\") on node \"crc\" DevicePath \"\"" Oct 03 18:47:28 crc kubenswrapper[4835]: I1003 18:47:28.880177 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff850717-c781-4037-81d1-d5538ea47f65-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 18:47:28 crc kubenswrapper[4835]: I1003 18:47:28.880189 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ff850717-c781-4037-81d1-d5538ea47f65-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:47:28 crc kubenswrapper[4835]: I1003 18:47:28.888377 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3f12e17-4d13-4099-b8c2-12b224ace2a6" path="/var/lib/kubelet/pods/b3f12e17-4d13-4099-b8c2-12b224ace2a6/volumes" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.241180 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" event={"ID":"ff850717-c781-4037-81d1-d5538ea47f65","Type":"ContainerDied","Data":"b4d71e5f1e28b0a872936741b3a0bc67bee432d1c6aec28f18cba408468c8214"} Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.241235 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4d71e5f1e28b0a872936741b3a0bc67bee432d1c6aec28f18cba408468c8214" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.241315 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.301983 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z"] Oct 03 18:47:29 crc kubenswrapper[4835]: E1003 18:47:29.302437 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3f12e17-4d13-4099-b8c2-12b224ace2a6" containerName="extract-utilities" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.302565 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f12e17-4d13-4099-b8c2-12b224ace2a6" containerName="extract-utilities" Oct 03 18:47:29 crc kubenswrapper[4835]: E1003 18:47:29.302587 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3f12e17-4d13-4099-b8c2-12b224ace2a6" containerName="extract-content" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.302595 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f12e17-4d13-4099-b8c2-12b224ace2a6" containerName="extract-content" Oct 03 18:47:29 crc kubenswrapper[4835]: E1003 18:47:29.302614 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3f12e17-4d13-4099-b8c2-12b224ace2a6" containerName="registry-server" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.302621 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f12e17-4d13-4099-b8c2-12b224ace2a6" containerName="registry-server" Oct 03 18:47:29 crc kubenswrapper[4835]: E1003 18:47:29.302647 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff850717-c781-4037-81d1-d5538ea47f65" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.302653 4835 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ff850717-c781-4037-81d1-d5538ea47f65" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.302829 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff850717-c781-4037-81d1-d5538ea47f65" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.302849 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3f12e17-4d13-4099-b8c2-12b224ace2a6" containerName="registry-server" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.303612 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.305780 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.306485 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.307051 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.307144 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.307586 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.307646 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.317506 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.321701 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.327966 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z"] Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.391046 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.391133 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.391166 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.391187 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.391247 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.391364 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.391447 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.391478 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.391759 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.391879 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.391950 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjkbp\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-kube-api-access-mjkbp\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.392140 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.392178 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.392205 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.494289 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.494335 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.494356 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: 
\"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.494401 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.494427 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.494448 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.494465 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.494496 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.494542 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.494566 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.494585 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.494627 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.494651 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.494669 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjkbp\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-kube-api-access-mjkbp\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.498806 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.498970 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.499791 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.499836 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.500103 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.500796 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.501655 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.501711 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.502148 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.502164 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.502499 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.503356 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.510482 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.510793 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjkbp\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-kube-api-access-mjkbp\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7l67z\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:29 crc kubenswrapper[4835]: I1003 18:47:29.622286 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:47:30 crc kubenswrapper[4835]: I1003 18:47:30.125739 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z"] Oct 03 18:47:30 crc kubenswrapper[4835]: I1003 18:47:30.250621 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" event={"ID":"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2","Type":"ContainerStarted","Data":"15003d89661bf1980da6c3d60a349305712133ba336d60c682cf26065e72cc5f"} Oct 03 18:47:31 crc kubenswrapper[4835]: I1003 18:47:31.268454 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" event={"ID":"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2","Type":"ContainerStarted","Data":"cbade086f4c9ac26df8380023d837dc8b5c05aad379ed42dd643496af6e35b7b"} Oct 03 18:47:31 crc kubenswrapper[4835]: I1003 18:47:31.289973 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" podStartSLOduration=1.428329011 podStartE2EDuration="2.289957999s" podCreationTimestamp="2025-10-03 18:47:29 +0000 UTC" firstStartedPulling="2025-10-03 18:47:30.13941372 +0000 UTC m=+1991.855354592" lastFinishedPulling="2025-10-03 18:47:31.001042698 +0000 UTC m=+1992.716983580" observedRunningTime="2025-10-03 18:47:31.286481724 +0000 UTC m=+1993.002422596" watchObservedRunningTime="2025-10-03 18:47:31.289957999 +0000 UTC m=+1993.005898871" Oct 03 18:47:35 crc kubenswrapper[4835]: I1003 18:47:35.358548 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:47:35 crc kubenswrapper[4835]: I1003 18:47:35.359104 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" 
podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:48:05 crc kubenswrapper[4835]: I1003 18:48:05.358697 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:48:05 crc kubenswrapper[4835]: I1003 18:48:05.359140 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:48:10 crc kubenswrapper[4835]: I1003 18:48:10.595280 4835 generic.go:334] "Generic (PLEG): container finished" podID="e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" containerID="cbade086f4c9ac26df8380023d837dc8b5c05aad379ed42dd643496af6e35b7b" exitCode=0 Oct 03 18:48:10 crc kubenswrapper[4835]: I1003 18:48:10.595334 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" event={"ID":"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2","Type":"ContainerDied","Data":"cbade086f4c9ac26df8380023d837dc8b5c05aad379ed42dd643496af6e35b7b"} Oct 03 18:48:11 crc kubenswrapper[4835]: I1003 18:48:11.983135 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.126802 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-ssh-key\") pod \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.126901 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-bootstrap-combined-ca-bundle\") pod \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.126944 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-nova-combined-ca-bundle\") pod \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.126995 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.127012 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-libvirt-combined-ca-bundle\") pod \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.127049 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.127107 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-neutron-metadata-combined-ca-bundle\") pod \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.127149 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-ovn-combined-ca-bundle\") pod \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.127170 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.127203 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-repo-setup-combined-ca-bundle\") pod \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.127225 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.127243 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjkbp\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-kube-api-access-mjkbp\") pod \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.127301 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-telemetry-combined-ca-bundle\") pod \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.127318 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-inventory\") pod 
\"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\" (UID: \"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2\") " Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.133745 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" (UID: "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.133858 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" (UID: "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.134244 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" (UID: "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.134568 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-kube-api-access-mjkbp" (OuterVolumeSpecName: "kube-api-access-mjkbp") pod "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" (UID: "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2"). InnerVolumeSpecName "kube-api-access-mjkbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.134723 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" (UID: "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.135115 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" (UID: "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.136880 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" (UID: "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.136959 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" (UID: "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.137932 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" (UID: "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.138420 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" (UID: "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.138587 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" (UID: "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.139710 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" (UID: "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.159909 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-inventory" (OuterVolumeSpecName: "inventory") pod "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" (UID: "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.160641 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" (UID: "e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.229856 4835 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.229890 4835 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.229900 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.229911 4835 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.229923 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.229933 4835 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.229944 4835 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.229953 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.229963 4835 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.229972 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.229981 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjkbp\" (UniqueName: \"kubernetes.io/projected/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-kube-api-access-mjkbp\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.229990 4835 reconciler_common.go:293] "Volume detached for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.229998 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.230005 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.611995 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" event={"ID":"e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2","Type":"ContainerDied","Data":"15003d89661bf1980da6c3d60a349305712133ba336d60c682cf26065e72cc5f"} Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.612034 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15003d89661bf1980da6c3d60a349305712133ba336d60c682cf26065e72cc5f" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.612051 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7l67z" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.779142 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq"] Oct 03 18:48:12 crc kubenswrapper[4835]: E1003 18:48:12.779653 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.779675 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.780845 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.781623 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.783953 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.784381 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.784624 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.784773 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.787036 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.792924 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq"] Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.942235 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jq6zq\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.942282 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jq6zq\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.942325 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jq6zq\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.942654 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rk7r\" (UniqueName: \"kubernetes.io/projected/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-kube-api-access-5rk7r\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jq6zq\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:12 crc kubenswrapper[4835]: I1003 18:48:12.942849 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jq6zq\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:13 crc kubenswrapper[4835]: I1003 18:48:13.044666 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rk7r\" 
(UniqueName: \"kubernetes.io/projected/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-kube-api-access-5rk7r\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jq6zq\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:13 crc kubenswrapper[4835]: I1003 18:48:13.044748 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jq6zq\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:13 crc kubenswrapper[4835]: I1003 18:48:13.044789 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jq6zq\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:13 crc kubenswrapper[4835]: I1003 18:48:13.044815 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jq6zq\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:13 crc kubenswrapper[4835]: I1003 18:48:13.044853 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jq6zq\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:13 crc kubenswrapper[4835]: I1003 18:48:13.045576 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jq6zq\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:13 crc kubenswrapper[4835]: I1003 18:48:13.048313 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jq6zq\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:13 crc kubenswrapper[4835]: I1003 18:48:13.048810 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jq6zq\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:13 crc kubenswrapper[4835]: I1003 18:48:13.054722 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jq6zq\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:13 crc kubenswrapper[4835]: I1003 18:48:13.066441 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rk7r\" (UniqueName: \"kubernetes.io/projected/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-kube-api-access-5rk7r\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jq6zq\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:13 crc kubenswrapper[4835]: I1003 18:48:13.097580 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:48:13 crc kubenswrapper[4835]: I1003 18:48:13.613646 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq"] Oct 03 18:48:14 crc kubenswrapper[4835]: I1003 18:48:14.631783 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" event={"ID":"f0d5e8ab-25a4-4213-9bb4-1e41116eab53","Type":"ContainerStarted","Data":"459b857c60eec13f8b021d4fe33ef8f6424df681b85332e7a49ffc5e3f9b4bbc"} Oct 03 18:48:14 crc kubenswrapper[4835]: I1003 18:48:14.632055 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" event={"ID":"f0d5e8ab-25a4-4213-9bb4-1e41116eab53","Type":"ContainerStarted","Data":"75596db96a571b1399a453ba181df8f4b2159bc41c03fd1ef1967e9ea94d2b2b"} Oct 03 18:48:14 crc kubenswrapper[4835]: I1003 18:48:14.649657 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" podStartSLOduration=2.083502767 podStartE2EDuration="2.649617352s" podCreationTimestamp="2025-10-03 18:48:12 +0000 UTC" firstStartedPulling="2025-10-03 18:48:13.616183672 +0000 UTC m=+2035.332124544" lastFinishedPulling="2025-10-03 18:48:14.182298257 +0000 UTC m=+2035.898239129" observedRunningTime="2025-10-03 18:48:14.646901585 +0000 UTC m=+2036.362842467" watchObservedRunningTime="2025-10-03 18:48:14.649617352 +0000 UTC m=+2036.365558234" Oct 03 18:48:23 crc kubenswrapper[4835]: I1003 18:48:23.272943 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v5wmk"] Oct 03 18:48:23 crc kubenswrapper[4835]: I1003 18:48:23.276157 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v5wmk" Oct 03 18:48:23 crc kubenswrapper[4835]: I1003 18:48:23.281135 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v5wmk"] Oct 03 18:48:23 crc kubenswrapper[4835]: I1003 18:48:23.431005 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mkhg\" (UniqueName: \"kubernetes.io/projected/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-kube-api-access-6mkhg\") pod \"redhat-operators-v5wmk\" (UID: \"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4\") " pod="openshift-marketplace/redhat-operators-v5wmk" Oct 03 18:48:23 crc kubenswrapper[4835]: I1003 18:48:23.431327 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-utilities\") pod \"redhat-operators-v5wmk\" (UID: \"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4\") " pod="openshift-marketplace/redhat-operators-v5wmk" Oct 03 18:48:23 crc kubenswrapper[4835]: I1003 18:48:23.431602 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-catalog-content\") pod \"redhat-operators-v5wmk\" (UID: \"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4\") " pod="openshift-marketplace/redhat-operators-v5wmk" Oct 03 18:48:23 crc kubenswrapper[4835]: I1003 18:48:23.533921 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-catalog-content\") pod \"redhat-operators-v5wmk\" (UID: \"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4\") " pod="openshift-marketplace/redhat-operators-v5wmk" Oct 03 18:48:23 crc kubenswrapper[4835]: I1003 18:48:23.534045 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mkhg\" (UniqueName: \"kubernetes.io/projected/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-kube-api-access-6mkhg\") pod \"redhat-operators-v5wmk\" (UID: \"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4\") " pod="openshift-marketplace/redhat-operators-v5wmk" Oct 03 18:48:23 crc kubenswrapper[4835]: I1003 18:48:23.534162 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-utilities\") pod \"redhat-operators-v5wmk\" (UID: \"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4\") " pod="openshift-marketplace/redhat-operators-v5wmk" Oct 03 18:48:23 crc kubenswrapper[4835]: I1003 18:48:23.534727 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-utilities\") pod \"redhat-operators-v5wmk\" (UID: \"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4\") " pod="openshift-marketplace/redhat-operators-v5wmk" Oct 03 18:48:23 crc kubenswrapper[4835]: I1003 18:48:23.534999 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-catalog-content\") pod \"redhat-operators-v5wmk\" (UID: \"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4\") " pod="openshift-marketplace/redhat-operators-v5wmk" Oct 03 18:48:23 crc kubenswrapper[4835]: I1003 18:48:23.552060 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6mkhg\" (UniqueName: \"kubernetes.io/projected/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-kube-api-access-6mkhg\") pod \"redhat-operators-v5wmk\" (UID: \"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4\") " pod="openshift-marketplace/redhat-operators-v5wmk" Oct 03 18:48:23 crc kubenswrapper[4835]: I1003 18:48:23.593285 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v5wmk" Oct 03 18:48:24 crc kubenswrapper[4835]: I1003 18:48:24.047684 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v5wmk"] Oct 03 18:48:24 crc kubenswrapper[4835]: I1003 18:48:24.725209 4835 generic.go:334] "Generic (PLEG): container finished" podID="10107ae5-a0f4-442b-94d7-5cc5afa4b3b4" containerID="383ba1af5df4fd8367cda3da2fc32ba3abbfbb6ee54dbee0811dca51fd9c8758" exitCode=0 Oct 03 18:48:24 crc kubenswrapper[4835]: I1003 18:48:24.725255 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5wmk" event={"ID":"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4","Type":"ContainerDied","Data":"383ba1af5df4fd8367cda3da2fc32ba3abbfbb6ee54dbee0811dca51fd9c8758"} Oct 03 18:48:24 crc kubenswrapper[4835]: I1003 18:48:24.725279 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5wmk" event={"ID":"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4","Type":"ContainerStarted","Data":"c16d5189cd673b228e67519c84d30dff7808f89f3c66431e9c2bbf776c70c8c3"} Oct 03 18:48:26 crc kubenswrapper[4835]: I1003 18:48:26.766512 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5wmk" event={"ID":"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4","Type":"ContainerStarted","Data":"3cbf89a16bed94400f897cbd0cac145351eb61a281557c85a501ea654cc9e4ed"} Oct 03 18:48:27 crc kubenswrapper[4835]: I1003 18:48:27.779300 4835 generic.go:334] "Generic (PLEG): container finished" podID="10107ae5-a0f4-442b-94d7-5cc5afa4b3b4" containerID="3cbf89a16bed94400f897cbd0cac145351eb61a281557c85a501ea654cc9e4ed" exitCode=0 Oct 03 18:48:27 crc kubenswrapper[4835]: I1003 18:48:27.779496 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5wmk" event={"ID":"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4","Type":"ContainerDied","Data":"3cbf89a16bed94400f897cbd0cac145351eb61a281557c85a501ea654cc9e4ed"} Oct 03 18:48:28 crc kubenswrapper[4835]: I1003 18:48:28.791279 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5wmk" event={"ID":"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4","Type":"ContainerStarted","Data":"a672d10598e6c7fa39372188c5d675cfa3a7ac5bd47691de697fd6be2a098ed1"} Oct 03 18:48:28 crc kubenswrapper[4835]: I1003 18:48:28.818570 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v5wmk" podStartSLOduration=2.375867043 podStartE2EDuration="5.818548434s" podCreationTimestamp="2025-10-03 18:48:23 +0000 UTC" firstStartedPulling="2025-10-03 18:48:24.72752862 +0000 UTC m=+2046.443469492" lastFinishedPulling="2025-10-03 18:48:28.170210011 +0000 UTC m=+2049.886150883" observedRunningTime="2025-10-03 18:48:28.811971432 +0000 UTC m=+2050.527912304" watchObservedRunningTime="2025-10-03 18:48:28.818548434 +0000 UTC m=+2050.534489306" Oct 03 18:48:33 crc kubenswrapper[4835]: I1003 18:48:33.595019 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v5wmk" Oct 
03 18:48:33 crc kubenswrapper[4835]: I1003 18:48:33.595702 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v5wmk" Oct 03 18:48:33 crc kubenswrapper[4835]: I1003 18:48:33.660650 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v5wmk" Oct 03 18:48:33 crc kubenswrapper[4835]: I1003 18:48:33.886407 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v5wmk" Oct 03 18:48:33 crc kubenswrapper[4835]: I1003 18:48:33.942479 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v5wmk"] Oct 03 18:48:35 crc kubenswrapper[4835]: I1003 18:48:35.358876 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:48:35 crc kubenswrapper[4835]: I1003 18:48:35.358946 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:48:35 crc kubenswrapper[4835]: I1003 18:48:35.358992 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:48:35 crc kubenswrapper[4835]: I1003 18:48:35.360454 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5deb0cf9b1410690f101cc5c84ba400be4c91a67e6bec74de8589a872a3c0d30"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 18:48:35 crc kubenswrapper[4835]: I1003 18:48:35.360561 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://5deb0cf9b1410690f101cc5c84ba400be4c91a67e6bec74de8589a872a3c0d30" gracePeriod=600 Oct 03 18:48:35 crc kubenswrapper[4835]: I1003 18:48:35.858791 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="5deb0cf9b1410690f101cc5c84ba400be4c91a67e6bec74de8589a872a3c0d30" exitCode=0 Oct 03 18:48:35 crc kubenswrapper[4835]: I1003 18:48:35.858865 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"5deb0cf9b1410690f101cc5c84ba400be4c91a67e6bec74de8589a872a3c0d30"} Oct 03 18:48:35 crc kubenswrapper[4835]: I1003 18:48:35.859155 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc"} Oct 03 18:48:35 crc kubenswrapper[4835]: I1003 18:48:35.859181 4835 scope.go:117] "RemoveContainer" 
containerID="df8b510b07d2e88952474616c0ceb7b94a28a8f7347638481bfe7fef28b2c0ce" Oct 03 18:48:35 crc kubenswrapper[4835]: I1003 18:48:35.859264 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v5wmk" podUID="10107ae5-a0f4-442b-94d7-5cc5afa4b3b4" containerName="registry-server" containerID="cri-o://a672d10598e6c7fa39372188c5d675cfa3a7ac5bd47691de697fd6be2a098ed1" gracePeriod=2 Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.274534 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v5wmk" Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.405508 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-catalog-content\") pod \"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4\" (UID: \"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4\") " Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.405955 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-utilities\") pod \"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4\" (UID: \"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4\") " Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.406174 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mkhg\" (UniqueName: \"kubernetes.io/projected/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-kube-api-access-6mkhg\") pod \"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4\" (UID: \"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4\") " Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.406559 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-utilities" (OuterVolumeSpecName: "utilities") pod "10107ae5-a0f4-442b-94d7-5cc5afa4b3b4" (UID: "10107ae5-a0f4-442b-94d7-5cc5afa4b3b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.406783 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.412017 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-kube-api-access-6mkhg" (OuterVolumeSpecName: "kube-api-access-6mkhg") pod "10107ae5-a0f4-442b-94d7-5cc5afa4b3b4" (UID: "10107ae5-a0f4-442b-94d7-5cc5afa4b3b4"). InnerVolumeSpecName "kube-api-access-6mkhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.493640 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10107ae5-a0f4-442b-94d7-5cc5afa4b3b4" (UID: "10107ae5-a0f4-442b-94d7-5cc5afa4b3b4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.511433 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mkhg\" (UniqueName: \"kubernetes.io/projected/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-kube-api-access-6mkhg\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.511479 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.869832 4835 generic.go:334] "Generic (PLEG): container finished" podID="10107ae5-a0f4-442b-94d7-5cc5afa4b3b4" containerID="a672d10598e6c7fa39372188c5d675cfa3a7ac5bd47691de697fd6be2a098ed1" exitCode=0 Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.869883 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v5wmk" Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.869922 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5wmk" event={"ID":"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4","Type":"ContainerDied","Data":"a672d10598e6c7fa39372188c5d675cfa3a7ac5bd47691de697fd6be2a098ed1"} Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.869958 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5wmk" event={"ID":"10107ae5-a0f4-442b-94d7-5cc5afa4b3b4","Type":"ContainerDied","Data":"c16d5189cd673b228e67519c84d30dff7808f89f3c66431e9c2bbf776c70c8c3"} Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.869984 4835 scope.go:117] "RemoveContainer" containerID="a672d10598e6c7fa39372188c5d675cfa3a7ac5bd47691de697fd6be2a098ed1" Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.893772 4835 scope.go:117] "RemoveContainer" containerID="3cbf89a16bed94400f897cbd0cac145351eb61a281557c85a501ea654cc9e4ed" Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.910153 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v5wmk"] Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.920273 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v5wmk"] Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.928395 4835 scope.go:117] "RemoveContainer" containerID="383ba1af5df4fd8367cda3da2fc32ba3abbfbb6ee54dbee0811dca51fd9c8758" Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.965584 4835 scope.go:117] "RemoveContainer" containerID="a672d10598e6c7fa39372188c5d675cfa3a7ac5bd47691de697fd6be2a098ed1" Oct 03 18:48:36 crc kubenswrapper[4835]: E1003 18:48:36.966000 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a672d10598e6c7fa39372188c5d675cfa3a7ac5bd47691de697fd6be2a098ed1\": container with ID starting with a672d10598e6c7fa39372188c5d675cfa3a7ac5bd47691de697fd6be2a098ed1 not found: ID does not exist" containerID="a672d10598e6c7fa39372188c5d675cfa3a7ac5bd47691de697fd6be2a098ed1" Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.966028 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a672d10598e6c7fa39372188c5d675cfa3a7ac5bd47691de697fd6be2a098ed1"} err="failed to get container status \"a672d10598e6c7fa39372188c5d675cfa3a7ac5bd47691de697fd6be2a098ed1\": 
rpc error: code = NotFound desc = could not find container \"a672d10598e6c7fa39372188c5d675cfa3a7ac5bd47691de697fd6be2a098ed1\": container with ID starting with a672d10598e6c7fa39372188c5d675cfa3a7ac5bd47691de697fd6be2a098ed1 not found: ID does not exist" Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.966052 4835 scope.go:117] "RemoveContainer" containerID="3cbf89a16bed94400f897cbd0cac145351eb61a281557c85a501ea654cc9e4ed" Oct 03 18:48:36 crc kubenswrapper[4835]: E1003 18:48:36.966353 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cbf89a16bed94400f897cbd0cac145351eb61a281557c85a501ea654cc9e4ed\": container with ID starting with 3cbf89a16bed94400f897cbd0cac145351eb61a281557c85a501ea654cc9e4ed not found: ID does not exist" containerID="3cbf89a16bed94400f897cbd0cac145351eb61a281557c85a501ea654cc9e4ed" Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.966394 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cbf89a16bed94400f897cbd0cac145351eb61a281557c85a501ea654cc9e4ed"} err="failed to get container status \"3cbf89a16bed94400f897cbd0cac145351eb61a281557c85a501ea654cc9e4ed\": rpc error: code = NotFound desc = could not find container \"3cbf89a16bed94400f897cbd0cac145351eb61a281557c85a501ea654cc9e4ed\": container with ID starting with 3cbf89a16bed94400f897cbd0cac145351eb61a281557c85a501ea654cc9e4ed not found: ID does not exist" Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.966420 4835 scope.go:117] "RemoveContainer" containerID="383ba1af5df4fd8367cda3da2fc32ba3abbfbb6ee54dbee0811dca51fd9c8758" Oct 03 18:48:36 crc kubenswrapper[4835]: E1003 18:48:36.966738 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"383ba1af5df4fd8367cda3da2fc32ba3abbfbb6ee54dbee0811dca51fd9c8758\": container with ID starting with 383ba1af5df4fd8367cda3da2fc32ba3abbfbb6ee54dbee0811dca51fd9c8758 not found: ID does not exist" containerID="383ba1af5df4fd8367cda3da2fc32ba3abbfbb6ee54dbee0811dca51fd9c8758" Oct 03 18:48:36 crc kubenswrapper[4835]: I1003 18:48:36.966790 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383ba1af5df4fd8367cda3da2fc32ba3abbfbb6ee54dbee0811dca51fd9c8758"} err="failed to get container status \"383ba1af5df4fd8367cda3da2fc32ba3abbfbb6ee54dbee0811dca51fd9c8758\": rpc error: code = NotFound desc = could not find container \"383ba1af5df4fd8367cda3da2fc32ba3abbfbb6ee54dbee0811dca51fd9c8758\": container with ID starting with 383ba1af5df4fd8367cda3da2fc32ba3abbfbb6ee54dbee0811dca51fd9c8758 not found: ID does not exist" Oct 03 18:48:38 crc kubenswrapper[4835]: I1003 18:48:38.891774 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10107ae5-a0f4-442b-94d7-5cc5afa4b3b4" path="/var/lib/kubelet/pods/10107ae5-a0f4-442b-94d7-5cc5afa4b3b4/volumes" Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.299269 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9btbn"] Oct 03 18:49:03 crc kubenswrapper[4835]: E1003 18:49:03.300189 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10107ae5-a0f4-442b-94d7-5cc5afa4b3b4" containerName="registry-server" Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.300204 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="10107ae5-a0f4-442b-94d7-5cc5afa4b3b4" containerName="registry-server" Oct 03 
18:49:03 crc kubenswrapper[4835]: E1003 18:49:03.300213 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10107ae5-a0f4-442b-94d7-5cc5afa4b3b4" containerName="extract-utilities" Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.300219 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="10107ae5-a0f4-442b-94d7-5cc5afa4b3b4" containerName="extract-utilities" Oct 03 18:49:03 crc kubenswrapper[4835]: E1003 18:49:03.300236 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10107ae5-a0f4-442b-94d7-5cc5afa4b3b4" containerName="extract-content" Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.300241 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="10107ae5-a0f4-442b-94d7-5cc5afa4b3b4" containerName="extract-content" Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.300458 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="10107ae5-a0f4-442b-94d7-5cc5afa4b3b4" containerName="registry-server" Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.301872 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.325391 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9btbn"] Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.443539 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d77fe34-776e-4bba-bbe7-5a6f7080c661-catalog-content\") pod \"certified-operators-9btbn\" (UID: \"3d77fe34-776e-4bba-bbe7-5a6f7080c661\") " pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.443905 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v68kv\" (UniqueName: \"kubernetes.io/projected/3d77fe34-776e-4bba-bbe7-5a6f7080c661-kube-api-access-v68kv\") pod \"certified-operators-9btbn\" (UID: \"3d77fe34-776e-4bba-bbe7-5a6f7080c661\") " pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.444150 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d77fe34-776e-4bba-bbe7-5a6f7080c661-utilities\") pod \"certified-operators-9btbn\" (UID: \"3d77fe34-776e-4bba-bbe7-5a6f7080c661\") " pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.546357 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d77fe34-776e-4bba-bbe7-5a6f7080c661-utilities\") pod \"certified-operators-9btbn\" (UID: \"3d77fe34-776e-4bba-bbe7-5a6f7080c661\") " pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.546454 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d77fe34-776e-4bba-bbe7-5a6f7080c661-catalog-content\") pod \"certified-operators-9btbn\" (UID: \"3d77fe34-776e-4bba-bbe7-5a6f7080c661\") " pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.546510 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-v68kv\" (UniqueName: \"kubernetes.io/projected/3d77fe34-776e-4bba-bbe7-5a6f7080c661-kube-api-access-v68kv\") pod \"certified-operators-9btbn\" (UID: \"3d77fe34-776e-4bba-bbe7-5a6f7080c661\") " pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.546946 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d77fe34-776e-4bba-bbe7-5a6f7080c661-utilities\") pod \"certified-operators-9btbn\" (UID: \"3d77fe34-776e-4bba-bbe7-5a6f7080c661\") " pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.547023 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d77fe34-776e-4bba-bbe7-5a6f7080c661-catalog-content\") pod \"certified-operators-9btbn\" (UID: \"3d77fe34-776e-4bba-bbe7-5a6f7080c661\") " pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.569492 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v68kv\" (UniqueName: \"kubernetes.io/projected/3d77fe34-776e-4bba-bbe7-5a6f7080c661-kube-api-access-v68kv\") pod \"certified-operators-9btbn\" (UID: \"3d77fe34-776e-4bba-bbe7-5a6f7080c661\") " pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:03 crc kubenswrapper[4835]: I1003 18:49:03.630909 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:04 crc kubenswrapper[4835]: I1003 18:49:04.124295 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9btbn"] Oct 03 18:49:04 crc kubenswrapper[4835]: I1003 18:49:04.696452 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8ft8h"] Oct 03 18:49:04 crc kubenswrapper[4835]: I1003 18:49:04.698815 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:04 crc kubenswrapper[4835]: I1003 18:49:04.711727 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ft8h"] Oct 03 18:49:04 crc kubenswrapper[4835]: I1003 18:49:04.768180 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwdqt\" (UniqueName: \"kubernetes.io/projected/77795dd7-c4d7-4293-bf1f-ee075bab33c6-kube-api-access-lwdqt\") pod \"community-operators-8ft8h\" (UID: \"77795dd7-c4d7-4293-bf1f-ee075bab33c6\") " pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:04 crc kubenswrapper[4835]: I1003 18:49:04.768364 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77795dd7-c4d7-4293-bf1f-ee075bab33c6-catalog-content\") pod \"community-operators-8ft8h\" (UID: \"77795dd7-c4d7-4293-bf1f-ee075bab33c6\") " pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:04 crc kubenswrapper[4835]: I1003 18:49:04.768403 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77795dd7-c4d7-4293-bf1f-ee075bab33c6-utilities\") pod \"community-operators-8ft8h\" (UID: \"77795dd7-c4d7-4293-bf1f-ee075bab33c6\") " pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:04 crc kubenswrapper[4835]: I1003 18:49:04.870381 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwdqt\" (UniqueName: \"kubernetes.io/projected/77795dd7-c4d7-4293-bf1f-ee075bab33c6-kube-api-access-lwdqt\") pod \"community-operators-8ft8h\" (UID: \"77795dd7-c4d7-4293-bf1f-ee075bab33c6\") " pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:04 crc kubenswrapper[4835]: I1003 18:49:04.870579 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77795dd7-c4d7-4293-bf1f-ee075bab33c6-utilities\") pod \"community-operators-8ft8h\" (UID: \"77795dd7-c4d7-4293-bf1f-ee075bab33c6\") " pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:04 crc kubenswrapper[4835]: I1003 18:49:04.870606 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77795dd7-c4d7-4293-bf1f-ee075bab33c6-catalog-content\") pod \"community-operators-8ft8h\" (UID: \"77795dd7-c4d7-4293-bf1f-ee075bab33c6\") " pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:04 crc kubenswrapper[4835]: I1003 18:49:04.871247 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77795dd7-c4d7-4293-bf1f-ee075bab33c6-utilities\") pod \"community-operators-8ft8h\" (UID: \"77795dd7-c4d7-4293-bf1f-ee075bab33c6\") " pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:04 crc kubenswrapper[4835]: I1003 18:49:04.871281 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77795dd7-c4d7-4293-bf1f-ee075bab33c6-catalog-content\") pod \"community-operators-8ft8h\" (UID: \"77795dd7-c4d7-4293-bf1f-ee075bab33c6\") " pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:04 crc kubenswrapper[4835]: I1003 18:49:04.900156 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lwdqt\" (UniqueName: \"kubernetes.io/projected/77795dd7-c4d7-4293-bf1f-ee075bab33c6-kube-api-access-lwdqt\") pod \"community-operators-8ft8h\" (UID: \"77795dd7-c4d7-4293-bf1f-ee075bab33c6\") " pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:05 crc kubenswrapper[4835]: I1003 18:49:05.032629 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:05 crc kubenswrapper[4835]: I1003 18:49:05.119708 4835 generic.go:334] "Generic (PLEG): container finished" podID="3d77fe34-776e-4bba-bbe7-5a6f7080c661" containerID="9a22ad8ee7b24ea03a623066ff669f0be33f64c9fbbf96ecef85f0aecf0899e2" exitCode=0 Oct 03 18:49:05 crc kubenswrapper[4835]: I1003 18:49:05.119751 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9btbn" event={"ID":"3d77fe34-776e-4bba-bbe7-5a6f7080c661","Type":"ContainerDied","Data":"9a22ad8ee7b24ea03a623066ff669f0be33f64c9fbbf96ecef85f0aecf0899e2"} Oct 03 18:49:05 crc kubenswrapper[4835]: I1003 18:49:05.119798 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9btbn" event={"ID":"3d77fe34-776e-4bba-bbe7-5a6f7080c661","Type":"ContainerStarted","Data":"87897eeaf5939c3185a574c403bb6beaf454349f32e1eca1b43550fe619deb2c"} Oct 03 18:49:05 crc kubenswrapper[4835]: I1003 18:49:05.545554 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8ft8h"] Oct 03 18:49:05 crc kubenswrapper[4835]: W1003 18:49:05.550310 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77795dd7_c4d7_4293_bf1f_ee075bab33c6.slice/crio-e2be47f270b21362a8a717d72c95e5d08390c70ba92321a6a4bd0e7b55441907 WatchSource:0}: Error finding container e2be47f270b21362a8a717d72c95e5d08390c70ba92321a6a4bd0e7b55441907: Status 404 returned error can't find the container with id e2be47f270b21362a8a717d72c95e5d08390c70ba92321a6a4bd0e7b55441907 Oct 03 18:49:06 crc kubenswrapper[4835]: I1003 18:49:06.129062 4835 generic.go:334] "Generic (PLEG): container finished" podID="77795dd7-c4d7-4293-bf1f-ee075bab33c6" containerID="87735626ac7fa1ea05fa23db5f2bf6a0f7c8091842c4cca1d1001b7276ba68ad" exitCode=0 Oct 03 18:49:06 crc kubenswrapper[4835]: I1003 18:49:06.129139 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ft8h" event={"ID":"77795dd7-c4d7-4293-bf1f-ee075bab33c6","Type":"ContainerDied","Data":"87735626ac7fa1ea05fa23db5f2bf6a0f7c8091842c4cca1d1001b7276ba68ad"} Oct 03 18:49:06 crc kubenswrapper[4835]: I1003 18:49:06.129435 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ft8h" event={"ID":"77795dd7-c4d7-4293-bf1f-ee075bab33c6","Type":"ContainerStarted","Data":"e2be47f270b21362a8a717d72c95e5d08390c70ba92321a6a4bd0e7b55441907"} Oct 03 18:49:06 crc kubenswrapper[4835]: I1003 18:49:06.131270 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9btbn" event={"ID":"3d77fe34-776e-4bba-bbe7-5a6f7080c661","Type":"ContainerStarted","Data":"231482f953a32da0108e47777602b2a589a2ea2c17cf9cd04e62669b75fb3e0c"} Oct 03 18:49:07 crc kubenswrapper[4835]: I1003 18:49:07.148841 4835 generic.go:334] "Generic (PLEG): container finished" podID="3d77fe34-776e-4bba-bbe7-5a6f7080c661" 
containerID="231482f953a32da0108e47777602b2a589a2ea2c17cf9cd04e62669b75fb3e0c" exitCode=0 Oct 03 18:49:07 crc kubenswrapper[4835]: I1003 18:49:07.148949 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9btbn" event={"ID":"3d77fe34-776e-4bba-bbe7-5a6f7080c661","Type":"ContainerDied","Data":"231482f953a32da0108e47777602b2a589a2ea2c17cf9cd04e62669b75fb3e0c"} Oct 03 18:49:08 crc kubenswrapper[4835]: I1003 18:49:08.159859 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9btbn" event={"ID":"3d77fe34-776e-4bba-bbe7-5a6f7080c661","Type":"ContainerStarted","Data":"988732d346cbd553efa0d57dd4f7e277badf5021bf3fdcc209e12438ab7c77fd"} Oct 03 18:49:08 crc kubenswrapper[4835]: I1003 18:49:08.161675 4835 generic.go:334] "Generic (PLEG): container finished" podID="77795dd7-c4d7-4293-bf1f-ee075bab33c6" containerID="8a7c34f2b6cd5f335e9c6214c18aa10d33a673d26ccbae991af803e40fa2f7df" exitCode=0 Oct 03 18:49:08 crc kubenswrapper[4835]: I1003 18:49:08.161704 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ft8h" event={"ID":"77795dd7-c4d7-4293-bf1f-ee075bab33c6","Type":"ContainerDied","Data":"8a7c34f2b6cd5f335e9c6214c18aa10d33a673d26ccbae991af803e40fa2f7df"} Oct 03 18:49:08 crc kubenswrapper[4835]: I1003 18:49:08.177249 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9btbn" podStartSLOduration=2.755518019 podStartE2EDuration="5.177230277s" podCreationTimestamp="2025-10-03 18:49:03 +0000 UTC" firstStartedPulling="2025-10-03 18:49:05.121089548 +0000 UTC m=+2086.837030420" lastFinishedPulling="2025-10-03 18:49:07.542801806 +0000 UTC m=+2089.258742678" observedRunningTime="2025-10-03 18:49:08.176189731 +0000 UTC m=+2089.892130603" watchObservedRunningTime="2025-10-03 18:49:08.177230277 +0000 UTC m=+2089.893171149" Oct 03 18:49:09 crc kubenswrapper[4835]: I1003 18:49:09.173820 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ft8h" event={"ID":"77795dd7-c4d7-4293-bf1f-ee075bab33c6","Type":"ContainerStarted","Data":"cab86a63df9431a0b68ce791275e003efa883c636b88a78b0813c055ccda8a1c"} Oct 03 18:49:09 crc kubenswrapper[4835]: I1003 18:49:09.195454 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8ft8h" podStartSLOduration=2.630546978 podStartE2EDuration="5.195436981s" podCreationTimestamp="2025-10-03 18:49:04 +0000 UTC" firstStartedPulling="2025-10-03 18:49:06.131239602 +0000 UTC m=+2087.847180474" lastFinishedPulling="2025-10-03 18:49:08.696129605 +0000 UTC m=+2090.412070477" observedRunningTime="2025-10-03 18:49:09.188727895 +0000 UTC m=+2090.904668777" watchObservedRunningTime="2025-10-03 18:49:09.195436981 +0000 UTC m=+2090.911377853" Oct 03 18:49:13 crc kubenswrapper[4835]: I1003 18:49:13.631763 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:13 crc kubenswrapper[4835]: I1003 18:49:13.633680 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:13 crc kubenswrapper[4835]: I1003 18:49:13.698654 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:14 crc kubenswrapper[4835]: I1003 18:49:14.262496 4835 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:15 crc kubenswrapper[4835]: I1003 18:49:15.033179 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:15 crc kubenswrapper[4835]: I1003 18:49:15.033281 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:15 crc kubenswrapper[4835]: I1003 18:49:15.089111 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:15 crc kubenswrapper[4835]: I1003 18:49:15.272233 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:15 crc kubenswrapper[4835]: I1003 18:49:15.491588 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9btbn"] Oct 03 18:49:16 crc kubenswrapper[4835]: I1003 18:49:16.236122 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9btbn" podUID="3d77fe34-776e-4bba-bbe7-5a6f7080c661" containerName="registry-server" containerID="cri-o://988732d346cbd553efa0d57dd4f7e277badf5021bf3fdcc209e12438ab7c77fd" gracePeriod=2 Oct 03 18:49:16 crc kubenswrapper[4835]: I1003 18:49:16.289613 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ft8h"] Oct 03 18:49:16 crc kubenswrapper[4835]: I1003 18:49:16.663594 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:16 crc kubenswrapper[4835]: I1003 18:49:16.817421 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d77fe34-776e-4bba-bbe7-5a6f7080c661-catalog-content\") pod \"3d77fe34-776e-4bba-bbe7-5a6f7080c661\" (UID: \"3d77fe34-776e-4bba-bbe7-5a6f7080c661\") " Oct 03 18:49:16 crc kubenswrapper[4835]: I1003 18:49:16.817698 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d77fe34-776e-4bba-bbe7-5a6f7080c661-utilities\") pod \"3d77fe34-776e-4bba-bbe7-5a6f7080c661\" (UID: \"3d77fe34-776e-4bba-bbe7-5a6f7080c661\") " Oct 03 18:49:16 crc kubenswrapper[4835]: I1003 18:49:16.817758 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v68kv\" (UniqueName: \"kubernetes.io/projected/3d77fe34-776e-4bba-bbe7-5a6f7080c661-kube-api-access-v68kv\") pod \"3d77fe34-776e-4bba-bbe7-5a6f7080c661\" (UID: \"3d77fe34-776e-4bba-bbe7-5a6f7080c661\") " Oct 03 18:49:16 crc kubenswrapper[4835]: I1003 18:49:16.818775 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d77fe34-776e-4bba-bbe7-5a6f7080c661-utilities" (OuterVolumeSpecName: "utilities") pod "3d77fe34-776e-4bba-bbe7-5a6f7080c661" (UID: "3d77fe34-776e-4bba-bbe7-5a6f7080c661"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:49:16 crc kubenswrapper[4835]: I1003 18:49:16.828586 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d77fe34-776e-4bba-bbe7-5a6f7080c661-kube-api-access-v68kv" (OuterVolumeSpecName: "kube-api-access-v68kv") pod "3d77fe34-776e-4bba-bbe7-5a6f7080c661" (UID: "3d77fe34-776e-4bba-bbe7-5a6f7080c661"). InnerVolumeSpecName "kube-api-access-v68kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:49:16 crc kubenswrapper[4835]: I1003 18:49:16.862821 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d77fe34-776e-4bba-bbe7-5a6f7080c661-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d77fe34-776e-4bba-bbe7-5a6f7080c661" (UID: "3d77fe34-776e-4bba-bbe7-5a6f7080c661"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:49:16 crc kubenswrapper[4835]: I1003 18:49:16.923297 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d77fe34-776e-4bba-bbe7-5a6f7080c661-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:49:16 crc kubenswrapper[4835]: I1003 18:49:16.923359 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v68kv\" (UniqueName: \"kubernetes.io/projected/3d77fe34-776e-4bba-bbe7-5a6f7080c661-kube-api-access-v68kv\") on node \"crc\" DevicePath \"\"" Oct 03 18:49:16 crc kubenswrapper[4835]: I1003 18:49:16.923374 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d77fe34-776e-4bba-bbe7-5a6f7080c661-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:49:17 crc kubenswrapper[4835]: I1003 18:49:17.248736 4835 generic.go:334] "Generic (PLEG): container finished" podID="3d77fe34-776e-4bba-bbe7-5a6f7080c661" containerID="988732d346cbd553efa0d57dd4f7e277badf5021bf3fdcc209e12438ab7c77fd" exitCode=0 Oct 03 18:49:17 crc kubenswrapper[4835]: I1003 18:49:17.248818 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9btbn" Oct 03 18:49:17 crc kubenswrapper[4835]: I1003 18:49:17.248807 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9btbn" event={"ID":"3d77fe34-776e-4bba-bbe7-5a6f7080c661","Type":"ContainerDied","Data":"988732d346cbd553efa0d57dd4f7e277badf5021bf3fdcc209e12438ab7c77fd"} Oct 03 18:49:17 crc kubenswrapper[4835]: I1003 18:49:17.248883 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9btbn" event={"ID":"3d77fe34-776e-4bba-bbe7-5a6f7080c661","Type":"ContainerDied","Data":"87897eeaf5939c3185a574c403bb6beaf454349f32e1eca1b43550fe619deb2c"} Oct 03 18:49:17 crc kubenswrapper[4835]: I1003 18:49:17.248907 4835 scope.go:117] "RemoveContainer" containerID="988732d346cbd553efa0d57dd4f7e277badf5021bf3fdcc209e12438ab7c77fd" Oct 03 18:49:17 crc kubenswrapper[4835]: I1003 18:49:17.272675 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9btbn"] Oct 03 18:49:17 crc kubenswrapper[4835]: I1003 18:49:17.276134 4835 scope.go:117] "RemoveContainer" containerID="231482f953a32da0108e47777602b2a589a2ea2c17cf9cd04e62669b75fb3e0c" Oct 03 18:49:17 crc kubenswrapper[4835]: I1003 18:49:17.281561 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9btbn"] Oct 03 18:49:17 crc kubenswrapper[4835]: I1003 18:49:17.301817 4835 scope.go:117] "RemoveContainer" containerID="9a22ad8ee7b24ea03a623066ff669f0be33f64c9fbbf96ecef85f0aecf0899e2" Oct 03 18:49:17 crc kubenswrapper[4835]: I1003 18:49:17.350434 4835 scope.go:117] "RemoveContainer" containerID="988732d346cbd553efa0d57dd4f7e277badf5021bf3fdcc209e12438ab7c77fd" Oct 03 18:49:17 crc kubenswrapper[4835]: E1003 18:49:17.351177 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"988732d346cbd553efa0d57dd4f7e277badf5021bf3fdcc209e12438ab7c77fd\": container with ID starting with 988732d346cbd553efa0d57dd4f7e277badf5021bf3fdcc209e12438ab7c77fd not found: ID does not exist" containerID="988732d346cbd553efa0d57dd4f7e277badf5021bf3fdcc209e12438ab7c77fd" Oct 03 18:49:17 crc kubenswrapper[4835]: I1003 18:49:17.351255 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"988732d346cbd553efa0d57dd4f7e277badf5021bf3fdcc209e12438ab7c77fd"} err="failed to get container status \"988732d346cbd553efa0d57dd4f7e277badf5021bf3fdcc209e12438ab7c77fd\": rpc error: code = NotFound desc = could not find container \"988732d346cbd553efa0d57dd4f7e277badf5021bf3fdcc209e12438ab7c77fd\": container with ID starting with 988732d346cbd553efa0d57dd4f7e277badf5021bf3fdcc209e12438ab7c77fd not found: ID does not exist" Oct 03 18:49:17 crc kubenswrapper[4835]: I1003 18:49:17.351296 4835 scope.go:117] "RemoveContainer" containerID="231482f953a32da0108e47777602b2a589a2ea2c17cf9cd04e62669b75fb3e0c" Oct 03 18:49:17 crc kubenswrapper[4835]: E1003 18:49:17.351779 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"231482f953a32da0108e47777602b2a589a2ea2c17cf9cd04e62669b75fb3e0c\": container with ID starting with 231482f953a32da0108e47777602b2a589a2ea2c17cf9cd04e62669b75fb3e0c not found: ID does not exist" containerID="231482f953a32da0108e47777602b2a589a2ea2c17cf9cd04e62669b75fb3e0c" Oct 03 18:49:17 crc kubenswrapper[4835]: I1003 18:49:17.351830 4835 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"231482f953a32da0108e47777602b2a589a2ea2c17cf9cd04e62669b75fb3e0c"} err="failed to get container status \"231482f953a32da0108e47777602b2a589a2ea2c17cf9cd04e62669b75fb3e0c\": rpc error: code = NotFound desc = could not find container \"231482f953a32da0108e47777602b2a589a2ea2c17cf9cd04e62669b75fb3e0c\": container with ID starting with 231482f953a32da0108e47777602b2a589a2ea2c17cf9cd04e62669b75fb3e0c not found: ID does not exist" Oct 03 18:49:17 crc kubenswrapper[4835]: I1003 18:49:17.351866 4835 scope.go:117] "RemoveContainer" containerID="9a22ad8ee7b24ea03a623066ff669f0be33f64c9fbbf96ecef85f0aecf0899e2" Oct 03 18:49:17 crc kubenswrapper[4835]: E1003 18:49:17.352252 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a22ad8ee7b24ea03a623066ff669f0be33f64c9fbbf96ecef85f0aecf0899e2\": container with ID starting with 9a22ad8ee7b24ea03a623066ff669f0be33f64c9fbbf96ecef85f0aecf0899e2 not found: ID does not exist" containerID="9a22ad8ee7b24ea03a623066ff669f0be33f64c9fbbf96ecef85f0aecf0899e2" Oct 03 18:49:17 crc kubenswrapper[4835]: I1003 18:49:17.352295 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a22ad8ee7b24ea03a623066ff669f0be33f64c9fbbf96ecef85f0aecf0899e2"} err="failed to get container status \"9a22ad8ee7b24ea03a623066ff669f0be33f64c9fbbf96ecef85f0aecf0899e2\": rpc error: code = NotFound desc = could not find container \"9a22ad8ee7b24ea03a623066ff669f0be33f64c9fbbf96ecef85f0aecf0899e2\": container with ID starting with 9a22ad8ee7b24ea03a623066ff669f0be33f64c9fbbf96ecef85f0aecf0899e2 not found: ID does not exist" Oct 03 18:49:18 crc kubenswrapper[4835]: I1003 18:49:18.259496 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8ft8h" podUID="77795dd7-c4d7-4293-bf1f-ee075bab33c6" containerName="registry-server" containerID="cri-o://cab86a63df9431a0b68ce791275e003efa883c636b88a78b0813c055ccda8a1c" gracePeriod=2 Oct 03 18:49:18 crc kubenswrapper[4835]: I1003 18:49:18.724174 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:18 crc kubenswrapper[4835]: I1003 18:49:18.864710 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwdqt\" (UniqueName: \"kubernetes.io/projected/77795dd7-c4d7-4293-bf1f-ee075bab33c6-kube-api-access-lwdqt\") pod \"77795dd7-c4d7-4293-bf1f-ee075bab33c6\" (UID: \"77795dd7-c4d7-4293-bf1f-ee075bab33c6\") " Oct 03 18:49:18 crc kubenswrapper[4835]: I1003 18:49:18.865002 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77795dd7-c4d7-4293-bf1f-ee075bab33c6-catalog-content\") pod \"77795dd7-c4d7-4293-bf1f-ee075bab33c6\" (UID: \"77795dd7-c4d7-4293-bf1f-ee075bab33c6\") " Oct 03 18:49:18 crc kubenswrapper[4835]: I1003 18:49:18.865095 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77795dd7-c4d7-4293-bf1f-ee075bab33c6-utilities\") pod \"77795dd7-c4d7-4293-bf1f-ee075bab33c6\" (UID: \"77795dd7-c4d7-4293-bf1f-ee075bab33c6\") " Oct 03 18:49:18 crc kubenswrapper[4835]: I1003 18:49:18.866428 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77795dd7-c4d7-4293-bf1f-ee075bab33c6-utilities" (OuterVolumeSpecName: "utilities") pod "77795dd7-c4d7-4293-bf1f-ee075bab33c6" (UID: "77795dd7-c4d7-4293-bf1f-ee075bab33c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:49:18 crc kubenswrapper[4835]: I1003 18:49:18.876508 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77795dd7-c4d7-4293-bf1f-ee075bab33c6-kube-api-access-lwdqt" (OuterVolumeSpecName: "kube-api-access-lwdqt") pod "77795dd7-c4d7-4293-bf1f-ee075bab33c6" (UID: "77795dd7-c4d7-4293-bf1f-ee075bab33c6"). InnerVolumeSpecName "kube-api-access-lwdqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:49:18 crc kubenswrapper[4835]: I1003 18:49:18.889556 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d77fe34-776e-4bba-bbe7-5a6f7080c661" path="/var/lib/kubelet/pods/3d77fe34-776e-4bba-bbe7-5a6f7080c661/volumes" Oct 03 18:49:18 crc kubenswrapper[4835]: I1003 18:49:18.923788 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77795dd7-c4d7-4293-bf1f-ee075bab33c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77795dd7-c4d7-4293-bf1f-ee075bab33c6" (UID: "77795dd7-c4d7-4293-bf1f-ee075bab33c6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:49:18 crc kubenswrapper[4835]: I1003 18:49:18.967557 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77795dd7-c4d7-4293-bf1f-ee075bab33c6-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:49:18 crc kubenswrapper[4835]: I1003 18:49:18.967816 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwdqt\" (UniqueName: \"kubernetes.io/projected/77795dd7-c4d7-4293-bf1f-ee075bab33c6-kube-api-access-lwdqt\") on node \"crc\" DevicePath \"\"" Oct 03 18:49:18 crc kubenswrapper[4835]: I1003 18:49:18.967828 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77795dd7-c4d7-4293-bf1f-ee075bab33c6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:49:19 crc kubenswrapper[4835]: I1003 18:49:19.273622 4835 generic.go:334] "Generic (PLEG): container finished" podID="77795dd7-c4d7-4293-bf1f-ee075bab33c6" containerID="cab86a63df9431a0b68ce791275e003efa883c636b88a78b0813c055ccda8a1c" exitCode=0 Oct 03 18:49:19 crc kubenswrapper[4835]: I1003 18:49:19.275005 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ft8h" event={"ID":"77795dd7-c4d7-4293-bf1f-ee075bab33c6","Type":"ContainerDied","Data":"cab86a63df9431a0b68ce791275e003efa883c636b88a78b0813c055ccda8a1c"} Oct 03 18:49:19 crc kubenswrapper[4835]: I1003 18:49:19.275201 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8ft8h" event={"ID":"77795dd7-c4d7-4293-bf1f-ee075bab33c6","Type":"ContainerDied","Data":"e2be47f270b21362a8a717d72c95e5d08390c70ba92321a6a4bd0e7b55441907"} Oct 03 18:49:19 crc kubenswrapper[4835]: I1003 18:49:19.275232 4835 scope.go:117] "RemoveContainer" containerID="cab86a63df9431a0b68ce791275e003efa883c636b88a78b0813c055ccda8a1c" Oct 03 18:49:19 crc kubenswrapper[4835]: I1003 18:49:19.279015 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8ft8h" Oct 03 18:49:19 crc kubenswrapper[4835]: I1003 18:49:19.315212 4835 scope.go:117] "RemoveContainer" containerID="8a7c34f2b6cd5f335e9c6214c18aa10d33a673d26ccbae991af803e40fa2f7df" Oct 03 18:49:19 crc kubenswrapper[4835]: I1003 18:49:19.321947 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8ft8h"] Oct 03 18:49:19 crc kubenswrapper[4835]: I1003 18:49:19.331554 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8ft8h"] Oct 03 18:49:19 crc kubenswrapper[4835]: I1003 18:49:19.354041 4835 scope.go:117] "RemoveContainer" containerID="87735626ac7fa1ea05fa23db5f2bf6a0f7c8091842c4cca1d1001b7276ba68ad" Oct 03 18:49:19 crc kubenswrapper[4835]: I1003 18:49:19.380562 4835 scope.go:117] "RemoveContainer" containerID="cab86a63df9431a0b68ce791275e003efa883c636b88a78b0813c055ccda8a1c" Oct 03 18:49:19 crc kubenswrapper[4835]: E1003 18:49:19.381194 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cab86a63df9431a0b68ce791275e003efa883c636b88a78b0813c055ccda8a1c\": container with ID starting with cab86a63df9431a0b68ce791275e003efa883c636b88a78b0813c055ccda8a1c not found: ID does not exist" containerID="cab86a63df9431a0b68ce791275e003efa883c636b88a78b0813c055ccda8a1c" Oct 03 18:49:19 crc kubenswrapper[4835]: I1003 18:49:19.381259 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cab86a63df9431a0b68ce791275e003efa883c636b88a78b0813c055ccda8a1c"} err="failed to get container status \"cab86a63df9431a0b68ce791275e003efa883c636b88a78b0813c055ccda8a1c\": rpc error: code = NotFound desc = could not find container \"cab86a63df9431a0b68ce791275e003efa883c636b88a78b0813c055ccda8a1c\": container with ID starting with cab86a63df9431a0b68ce791275e003efa883c636b88a78b0813c055ccda8a1c not found: ID does not exist" Oct 03 18:49:19 crc kubenswrapper[4835]: I1003 18:49:19.381291 4835 scope.go:117] "RemoveContainer" containerID="8a7c34f2b6cd5f335e9c6214c18aa10d33a673d26ccbae991af803e40fa2f7df" Oct 03 18:49:19 crc kubenswrapper[4835]: E1003 18:49:19.381739 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a7c34f2b6cd5f335e9c6214c18aa10d33a673d26ccbae991af803e40fa2f7df\": container with ID starting with 8a7c34f2b6cd5f335e9c6214c18aa10d33a673d26ccbae991af803e40fa2f7df not found: ID does not exist" containerID="8a7c34f2b6cd5f335e9c6214c18aa10d33a673d26ccbae991af803e40fa2f7df" Oct 03 18:49:19 crc kubenswrapper[4835]: I1003 18:49:19.381780 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a7c34f2b6cd5f335e9c6214c18aa10d33a673d26ccbae991af803e40fa2f7df"} err="failed to get container status \"8a7c34f2b6cd5f335e9c6214c18aa10d33a673d26ccbae991af803e40fa2f7df\": rpc error: code = NotFound desc = could not find container \"8a7c34f2b6cd5f335e9c6214c18aa10d33a673d26ccbae991af803e40fa2f7df\": container with ID starting with 8a7c34f2b6cd5f335e9c6214c18aa10d33a673d26ccbae991af803e40fa2f7df not found: ID does not exist" Oct 03 18:49:19 crc kubenswrapper[4835]: I1003 18:49:19.381793 4835 scope.go:117] "RemoveContainer" containerID="87735626ac7fa1ea05fa23db5f2bf6a0f7c8091842c4cca1d1001b7276ba68ad" Oct 03 18:49:19 crc kubenswrapper[4835]: E1003 18:49:19.382206 4835 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"87735626ac7fa1ea05fa23db5f2bf6a0f7c8091842c4cca1d1001b7276ba68ad\": container with ID starting with 87735626ac7fa1ea05fa23db5f2bf6a0f7c8091842c4cca1d1001b7276ba68ad not found: ID does not exist" containerID="87735626ac7fa1ea05fa23db5f2bf6a0f7c8091842c4cca1d1001b7276ba68ad" Oct 03 18:49:19 crc kubenswrapper[4835]: I1003 18:49:19.382247 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87735626ac7fa1ea05fa23db5f2bf6a0f7c8091842c4cca1d1001b7276ba68ad"} err="failed to get container status \"87735626ac7fa1ea05fa23db5f2bf6a0f7c8091842c4cca1d1001b7276ba68ad\": rpc error: code = NotFound desc = could not find container \"87735626ac7fa1ea05fa23db5f2bf6a0f7c8091842c4cca1d1001b7276ba68ad\": container with ID starting with 87735626ac7fa1ea05fa23db5f2bf6a0f7c8091842c4cca1d1001b7276ba68ad not found: ID does not exist" Oct 03 18:49:20 crc kubenswrapper[4835]: I1003 18:49:20.283298 4835 generic.go:334] "Generic (PLEG): container finished" podID="f0d5e8ab-25a4-4213-9bb4-1e41116eab53" containerID="459b857c60eec13f8b021d4fe33ef8f6424df681b85332e7a49ffc5e3f9b4bbc" exitCode=0 Oct 03 18:49:20 crc kubenswrapper[4835]: I1003 18:49:20.283393 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" event={"ID":"f0d5e8ab-25a4-4213-9bb4-1e41116eab53","Type":"ContainerDied","Data":"459b857c60eec13f8b021d4fe33ef8f6424df681b85332e7a49ffc5e3f9b4bbc"} Oct 03 18:49:20 crc kubenswrapper[4835]: I1003 18:49:20.887805 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77795dd7-c4d7-4293-bf1f-ee075bab33c6" path="/var/lib/kubelet/pods/77795dd7-c4d7-4293-bf1f-ee075bab33c6/volumes" Oct 03 18:49:21 crc kubenswrapper[4835]: I1003 18:49:21.665759 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:49:21 crc kubenswrapper[4835]: I1003 18:49:21.729730 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rk7r\" (UniqueName: \"kubernetes.io/projected/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-kube-api-access-5rk7r\") pod \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " Oct 03 18:49:21 crc kubenswrapper[4835]: I1003 18:49:21.730313 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ovn-combined-ca-bundle\") pod \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " Oct 03 18:49:21 crc kubenswrapper[4835]: I1003 18:49:21.730460 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-inventory\") pod \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " Oct 03 18:49:21 crc kubenswrapper[4835]: I1003 18:49:21.730629 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ovncontroller-config-0\") pod \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " Oct 03 18:49:21 crc kubenswrapper[4835]: I1003 18:49:21.731477 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ssh-key\") pod \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\" (UID: \"f0d5e8ab-25a4-4213-9bb4-1e41116eab53\") " Oct 03 18:49:21 crc kubenswrapper[4835]: I1003 18:49:21.735986 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f0d5e8ab-25a4-4213-9bb4-1e41116eab53" (UID: "f0d5e8ab-25a4-4213-9bb4-1e41116eab53"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:49:21 crc kubenswrapper[4835]: I1003 18:49:21.736476 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-kube-api-access-5rk7r" (OuterVolumeSpecName: "kube-api-access-5rk7r") pod "f0d5e8ab-25a4-4213-9bb4-1e41116eab53" (UID: "f0d5e8ab-25a4-4213-9bb4-1e41116eab53"). InnerVolumeSpecName "kube-api-access-5rk7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:49:21 crc kubenswrapper[4835]: I1003 18:49:21.755673 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "f0d5e8ab-25a4-4213-9bb4-1e41116eab53" (UID: "f0d5e8ab-25a4-4213-9bb4-1e41116eab53"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:49:21 crc kubenswrapper[4835]: I1003 18:49:21.758986 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-inventory" (OuterVolumeSpecName: "inventory") pod "f0d5e8ab-25a4-4213-9bb4-1e41116eab53" (UID: "f0d5e8ab-25a4-4213-9bb4-1e41116eab53"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:49:21 crc kubenswrapper[4835]: I1003 18:49:21.760482 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f0d5e8ab-25a4-4213-9bb4-1e41116eab53" (UID: "f0d5e8ab-25a4-4213-9bb4-1e41116eab53"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:49:21 crc kubenswrapper[4835]: I1003 18:49:21.833588 4835 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:49:21 crc kubenswrapper[4835]: I1003 18:49:21.833624 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 18:49:21 crc kubenswrapper[4835]: I1003 18:49:21.833638 4835 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:49:21 crc kubenswrapper[4835]: I1003 18:49:21.833651 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:49:21 crc kubenswrapper[4835]: I1003 18:49:21.833661 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rk7r\" (UniqueName: \"kubernetes.io/projected/f0d5e8ab-25a4-4213-9bb4-1e41116eab53-kube-api-access-5rk7r\") on node \"crc\" DevicePath \"\"" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.303846 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" event={"ID":"f0d5e8ab-25a4-4213-9bb4-1e41116eab53","Type":"ContainerDied","Data":"75596db96a571b1399a453ba181df8f4b2159bc41c03fd1ef1967e9ea94d2b2b"} Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.304290 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75596db96a571b1399a453ba181df8f4b2159bc41c03fd1ef1967e9ea94d2b2b" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.303887 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jq6zq" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.386709 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp"] Oct 03 18:49:22 crc kubenswrapper[4835]: E1003 18:49:22.387176 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d77fe34-776e-4bba-bbe7-5a6f7080c661" containerName="extract-content" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.387193 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d77fe34-776e-4bba-bbe7-5a6f7080c661" containerName="extract-content" Oct 03 18:49:22 crc kubenswrapper[4835]: E1003 18:49:22.387209 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d5e8ab-25a4-4213-9bb4-1e41116eab53" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.387216 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d5e8ab-25a4-4213-9bb4-1e41116eab53" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 18:49:22 crc kubenswrapper[4835]: E1003 18:49:22.387231 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77795dd7-c4d7-4293-bf1f-ee075bab33c6" containerName="extract-utilities" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.387238 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="77795dd7-c4d7-4293-bf1f-ee075bab33c6" containerName="extract-utilities" Oct 03 18:49:22 crc kubenswrapper[4835]: E1003 18:49:22.387247 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77795dd7-c4d7-4293-bf1f-ee075bab33c6" containerName="registry-server" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.387252 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="77795dd7-c4d7-4293-bf1f-ee075bab33c6" containerName="registry-server" Oct 03 18:49:22 crc kubenswrapper[4835]: E1003 18:49:22.387272 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d77fe34-776e-4bba-bbe7-5a6f7080c661" containerName="registry-server" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.387278 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d77fe34-776e-4bba-bbe7-5a6f7080c661" containerName="registry-server" Oct 03 18:49:22 crc kubenswrapper[4835]: E1003 18:49:22.387293 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77795dd7-c4d7-4293-bf1f-ee075bab33c6" containerName="extract-content" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.387298 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="77795dd7-c4d7-4293-bf1f-ee075bab33c6" containerName="extract-content" Oct 03 18:49:22 crc kubenswrapper[4835]: E1003 18:49:22.387311 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d77fe34-776e-4bba-bbe7-5a6f7080c661" containerName="extract-utilities" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.387316 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d77fe34-776e-4bba-bbe7-5a6f7080c661" containerName="extract-utilities" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.387493 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0d5e8ab-25a4-4213-9bb4-1e41116eab53" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.387504 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d77fe34-776e-4bba-bbe7-5a6f7080c661" containerName="registry-server" Oct 03 
18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.387523 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="77795dd7-c4d7-4293-bf1f-ee075bab33c6" containerName="registry-server" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.388193 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.390780 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.390843 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.390797 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.391247 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.392111 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.395504 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.399014 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp"] Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.443815 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.443956 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.444209 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.444381 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nkt4\" (UniqueName: \"kubernetes.io/projected/440c7001-e0d5-4840-8852-3b1b59285550-kube-api-access-7nkt4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.444470 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.444619 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.546158 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.546212 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.546249 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nkt4\" (UniqueName: \"kubernetes.io/projected/440c7001-e0d5-4840-8852-3b1b59285550-kube-api-access-7nkt4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.546277 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.546327 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.546405 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.550631 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.550688 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.551014 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.551117 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.551156 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.563561 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nkt4\" (UniqueName: \"kubernetes.io/projected/440c7001-e0d5-4840-8852-3b1b59285550-kube-api-access-7nkt4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:22 crc kubenswrapper[4835]: I1003 18:49:22.703695 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:49:23 crc kubenswrapper[4835]: I1003 18:49:23.223681 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp"] Oct 03 18:49:23 crc kubenswrapper[4835]: W1003 18:49:23.228224 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod440c7001_e0d5_4840_8852_3b1b59285550.slice/crio-01688488a33b8b7103e24f58b36a14b63b31fe1cd6c078e42f112c601952e137 WatchSource:0}: Error finding container 01688488a33b8b7103e24f58b36a14b63b31fe1cd6c078e42f112c601952e137: Status 404 returned error can't find the container with id 01688488a33b8b7103e24f58b36a14b63b31fe1cd6c078e42f112c601952e137 Oct 03 18:49:23 crc kubenswrapper[4835]: I1003 18:49:23.313977 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" event={"ID":"440c7001-e0d5-4840-8852-3b1b59285550","Type":"ContainerStarted","Data":"01688488a33b8b7103e24f58b36a14b63b31fe1cd6c078e42f112c601952e137"} Oct 03 18:49:24 crc kubenswrapper[4835]: I1003 18:49:24.323770 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" event={"ID":"440c7001-e0d5-4840-8852-3b1b59285550","Type":"ContainerStarted","Data":"9dfc14da0478f8358108a1afa466d5a94812c80c8cd578df1584e80eb28a03e5"} Oct 03 18:49:24 crc kubenswrapper[4835]: I1003 18:49:24.339774 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" podStartSLOduration=1.8305270569999998 podStartE2EDuration="2.339753807s" podCreationTimestamp="2025-10-03 18:49:22 +0000 UTC" firstStartedPulling="2025-10-03 18:49:23.230612519 +0000 UTC m=+2104.946553391" lastFinishedPulling="2025-10-03 18:49:23.739839269 +0000 UTC m=+2105.455780141" observedRunningTime="2025-10-03 18:49:24.337318137 +0000 UTC m=+2106.053259019" watchObservedRunningTime="2025-10-03 18:49:24.339753807 +0000 UTC m=+2106.055694679" Oct 03 18:50:19 crc kubenswrapper[4835]: I1003 18:50:19.839358 4835 generic.go:334] "Generic (PLEG): container finished" podID="440c7001-e0d5-4840-8852-3b1b59285550" containerID="9dfc14da0478f8358108a1afa466d5a94812c80c8cd578df1584e80eb28a03e5" exitCode=0 Oct 03 18:50:19 crc kubenswrapper[4835]: I1003 18:50:19.839434 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" event={"ID":"440c7001-e0d5-4840-8852-3b1b59285550","Type":"ContainerDied","Data":"9dfc14da0478f8358108a1afa466d5a94812c80c8cd578df1584e80eb28a03e5"} Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.280998 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.391793 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-ssh-key\") pod \"440c7001-e0d5-4840-8852-3b1b59285550\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.392383 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-neutron-metadata-combined-ca-bundle\") pod \"440c7001-e0d5-4840-8852-3b1b59285550\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.392444 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-neutron-ovn-metadata-agent-neutron-config-0\") pod \"440c7001-e0d5-4840-8852-3b1b59285550\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.392471 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-nova-metadata-neutron-config-0\") pod \"440c7001-e0d5-4840-8852-3b1b59285550\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.392571 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-inventory\") pod \"440c7001-e0d5-4840-8852-3b1b59285550\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.392589 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nkt4\" (UniqueName: \"kubernetes.io/projected/440c7001-e0d5-4840-8852-3b1b59285550-kube-api-access-7nkt4\") pod \"440c7001-e0d5-4840-8852-3b1b59285550\" (UID: \"440c7001-e0d5-4840-8852-3b1b59285550\") " Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.398237 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "440c7001-e0d5-4840-8852-3b1b59285550" (UID: "440c7001-e0d5-4840-8852-3b1b59285550"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.404180 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/440c7001-e0d5-4840-8852-3b1b59285550-kube-api-access-7nkt4" (OuterVolumeSpecName: "kube-api-access-7nkt4") pod "440c7001-e0d5-4840-8852-3b1b59285550" (UID: "440c7001-e0d5-4840-8852-3b1b59285550"). InnerVolumeSpecName "kube-api-access-7nkt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.421956 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-inventory" (OuterVolumeSpecName: "inventory") pod "440c7001-e0d5-4840-8852-3b1b59285550" (UID: "440c7001-e0d5-4840-8852-3b1b59285550"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.424307 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "440c7001-e0d5-4840-8852-3b1b59285550" (UID: "440c7001-e0d5-4840-8852-3b1b59285550"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.424701 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "440c7001-e0d5-4840-8852-3b1b59285550" (UID: "440c7001-e0d5-4840-8852-3b1b59285550"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.432031 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "440c7001-e0d5-4840-8852-3b1b59285550" (UID: "440c7001-e0d5-4840-8852-3b1b59285550"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.495285 4835 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.495323 4835 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.495338 4835 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.495353 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.495365 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nkt4\" (UniqueName: \"kubernetes.io/projected/440c7001-e0d5-4840-8852-3b1b59285550-kube-api-access-7nkt4\") on node \"crc\" DevicePath \"\"" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.495378 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/440c7001-e0d5-4840-8852-3b1b59285550-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.856090 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" event={"ID":"440c7001-e0d5-4840-8852-3b1b59285550","Type":"ContainerDied","Data":"01688488a33b8b7103e24f58b36a14b63b31fe1cd6c078e42f112c601952e137"} Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.856128 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.856136 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01688488a33b8b7103e24f58b36a14b63b31fe1cd6c078e42f112c601952e137" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.950914 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng"] Oct 03 18:50:21 crc kubenswrapper[4835]: E1003 18:50:21.951378 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440c7001-e0d5-4840-8852-3b1b59285550" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.951441 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="440c7001-e0d5-4840-8852-3b1b59285550" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.951711 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="440c7001-e0d5-4840-8852-3b1b59285550" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.952529 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.959242 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.959328 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.959333 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.959573 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.959689 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:50:21 crc kubenswrapper[4835]: I1003 18:50:21.964581 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng"] Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.003941 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-99fng\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.004150 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-99fng\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.004239 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgpvb\" (UniqueName: \"kubernetes.io/projected/f6fd81f4-842f-4628-8044-45b76f848087-kube-api-access-kgpvb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-99fng\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.004310 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-99fng\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.004596 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-99fng\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.106541 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-99fng\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.106937 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-99fng\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.107010 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-99fng\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.107182 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-99fng\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.107234 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgpvb\" (UniqueName: \"kubernetes.io/projected/f6fd81f4-842f-4628-8044-45b76f848087-kube-api-access-kgpvb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-99fng\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.111698 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-99fng\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.111879 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-99fng\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.115438 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-99fng\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.116693 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-99fng\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.124737 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgpvb\" (UniqueName: \"kubernetes.io/projected/f6fd81f4-842f-4628-8044-45b76f848087-kube-api-access-kgpvb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-99fng\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.277064 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.807709 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng"] Oct 03 18:50:22 crc kubenswrapper[4835]: I1003 18:50:22.864900 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" event={"ID":"f6fd81f4-842f-4628-8044-45b76f848087","Type":"ContainerStarted","Data":"b2d640f9752a39a3eb6f4f4f7dadd2e9d539e334382267d17c6b64ca82053047"} Oct 03 18:50:23 crc kubenswrapper[4835]: I1003 18:50:23.874834 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" event={"ID":"f6fd81f4-842f-4628-8044-45b76f848087","Type":"ContainerStarted","Data":"48f0567f1affde4ac925e6013a8c1c83f119b512d4a88a0325c39dfbb701b22d"} Oct 03 18:50:23 crc kubenswrapper[4835]: I1003 18:50:23.922493 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" podStartSLOduration=2.297936198 podStartE2EDuration="2.922464629s" podCreationTimestamp="2025-10-03 18:50:21 +0000 UTC" firstStartedPulling="2025-10-03 18:50:22.81327575 +0000 UTC m=+2164.529216622" lastFinishedPulling="2025-10-03 18:50:23.437804181 +0000 UTC m=+2165.153745053" observedRunningTime="2025-10-03 18:50:23.909475277 +0000 UTC m=+2165.625416159" watchObservedRunningTime="2025-10-03 18:50:23.922464629 +0000 UTC m=+2165.638405501" Oct 03 18:50:35 crc kubenswrapper[4835]: I1003 18:50:35.358387 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:50:35 crc kubenswrapper[4835]: I1003 18:50:35.359260 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:51:05 crc kubenswrapper[4835]: I1003 18:51:05.358584 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:51:05 crc kubenswrapper[4835]: I1003 18:51:05.359191 4835 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:51:35 crc kubenswrapper[4835]: I1003 18:51:35.362201 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:51:35 crc kubenswrapper[4835]: I1003 18:51:35.362728 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:51:35 crc kubenswrapper[4835]: I1003 18:51:35.362774 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 18:51:35 crc kubenswrapper[4835]: I1003 18:51:35.363528 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 18:51:35 crc kubenswrapper[4835]: I1003 18:51:35.363581 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" gracePeriod=600 Oct 03 18:51:35 crc kubenswrapper[4835]: E1003 18:51:35.495706 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:51:36 crc kubenswrapper[4835]: I1003 18:51:36.499788 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" exitCode=0 Oct 03 18:51:36 crc kubenswrapper[4835]: I1003 18:51:36.499870 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc"} Oct 03 18:51:36 crc kubenswrapper[4835]: I1003 18:51:36.500216 4835 scope.go:117] "RemoveContainer" containerID="5deb0cf9b1410690f101cc5c84ba400be4c91a67e6bec74de8589a872a3c0d30" Oct 03 18:51:36 crc kubenswrapper[4835]: I1003 18:51:36.500887 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:51:36 crc kubenswrapper[4835]: E1003 
18:51:36.501153 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:51:46 crc kubenswrapper[4835]: I1003 18:51:46.876840 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:51:46 crc kubenswrapper[4835]: E1003 18:51:46.877654 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:52:01 crc kubenswrapper[4835]: I1003 18:52:01.877295 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:52:01 crc kubenswrapper[4835]: E1003 18:52:01.878052 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:52:12 crc kubenswrapper[4835]: I1003 18:52:12.877466 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:52:12 crc kubenswrapper[4835]: E1003 18:52:12.879010 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:52:26 crc kubenswrapper[4835]: I1003 18:52:26.876959 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:52:26 crc kubenswrapper[4835]: E1003 18:52:26.877720 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:52:39 crc kubenswrapper[4835]: I1003 18:52:39.878413 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:52:39 crc kubenswrapper[4835]: E1003 18:52:39.879613 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:52:53 crc kubenswrapper[4835]: I1003 18:52:53.876602 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:52:53 crc kubenswrapper[4835]: E1003 18:52:53.877292 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:53:05 crc kubenswrapper[4835]: I1003 18:53:05.877428 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:53:05 crc kubenswrapper[4835]: E1003 18:53:05.878826 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:53:18 crc kubenswrapper[4835]: I1003 18:53:18.883033 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:53:18 crc kubenswrapper[4835]: E1003 18:53:18.883866 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:53:31 crc kubenswrapper[4835]: I1003 18:53:31.877104 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:53:31 crc kubenswrapper[4835]: E1003 18:53:31.877836 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:53:46 crc kubenswrapper[4835]: I1003 18:53:46.877678 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:53:46 crc kubenswrapper[4835]: E1003 18:53:46.878775 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:54:01 crc kubenswrapper[4835]: I1003 18:54:01.876838 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:54:01 crc kubenswrapper[4835]: E1003 18:54:01.877715 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:54:15 crc kubenswrapper[4835]: I1003 18:54:15.877738 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:54:15 crc kubenswrapper[4835]: E1003 18:54:15.879492 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:54:27 crc kubenswrapper[4835]: I1003 18:54:27.877671 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:54:27 crc kubenswrapper[4835]: E1003 18:54:27.878426 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:54:40 crc kubenswrapper[4835]: I1003 18:54:40.876905 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:54:40 crc kubenswrapper[4835]: E1003 18:54:40.877751 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:54:55 crc kubenswrapper[4835]: I1003 18:54:55.878682 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:54:55 crc kubenswrapper[4835]: E1003 18:54:55.880018 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:55:08 crc kubenswrapper[4835]: I1003 18:55:08.883278 4835 
scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:55:08 crc kubenswrapper[4835]: E1003 18:55:08.883918 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:55:16 crc kubenswrapper[4835]: I1003 18:55:16.568548 4835 generic.go:334] "Generic (PLEG): container finished" podID="f6fd81f4-842f-4628-8044-45b76f848087" containerID="48f0567f1affde4ac925e6013a8c1c83f119b512d4a88a0325c39dfbb701b22d" exitCode=0 Oct 03 18:55:16 crc kubenswrapper[4835]: I1003 18:55:16.568648 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" event={"ID":"f6fd81f4-842f-4628-8044-45b76f848087","Type":"ContainerDied","Data":"48f0567f1affde4ac925e6013a8c1c83f119b512d4a88a0325c39dfbb701b22d"} Oct 03 18:55:17 crc kubenswrapper[4835]: I1003 18:55:17.982769 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.093082 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-libvirt-combined-ca-bundle\") pod \"f6fd81f4-842f-4628-8044-45b76f848087\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.093295 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-libvirt-secret-0\") pod \"f6fd81f4-842f-4628-8044-45b76f848087\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.093361 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-ssh-key\") pod \"f6fd81f4-842f-4628-8044-45b76f848087\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.093394 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-inventory\") pod \"f6fd81f4-842f-4628-8044-45b76f848087\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.093496 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgpvb\" (UniqueName: \"kubernetes.io/projected/f6fd81f4-842f-4628-8044-45b76f848087-kube-api-access-kgpvb\") pod \"f6fd81f4-842f-4628-8044-45b76f848087\" (UID: \"f6fd81f4-842f-4628-8044-45b76f848087\") " Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.101356 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f6fd81f4-842f-4628-8044-45b76f848087" (UID: "f6fd81f4-842f-4628-8044-45b76f848087"). 
InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.102645 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6fd81f4-842f-4628-8044-45b76f848087-kube-api-access-kgpvb" (OuterVolumeSpecName: "kube-api-access-kgpvb") pod "f6fd81f4-842f-4628-8044-45b76f848087" (UID: "f6fd81f4-842f-4628-8044-45b76f848087"). InnerVolumeSpecName "kube-api-access-kgpvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.125490 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f6fd81f4-842f-4628-8044-45b76f848087" (UID: "f6fd81f4-842f-4628-8044-45b76f848087"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.130033 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-inventory" (OuterVolumeSpecName: "inventory") pod "f6fd81f4-842f-4628-8044-45b76f848087" (UID: "f6fd81f4-842f-4628-8044-45b76f848087"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.130338 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "f6fd81f4-842f-4628-8044-45b76f848087" (UID: "f6fd81f4-842f-4628-8044-45b76f848087"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.196051 4835 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.196090 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.196099 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.196107 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgpvb\" (UniqueName: \"kubernetes.io/projected/f6fd81f4-842f-4628-8044-45b76f848087-kube-api-access-kgpvb\") on node \"crc\" DevicePath \"\"" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.196119 4835 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6fd81f4-842f-4628-8044-45b76f848087-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.584234 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" event={"ID":"f6fd81f4-842f-4628-8044-45b76f848087","Type":"ContainerDied","Data":"b2d640f9752a39a3eb6f4f4f7dadd2e9d539e334382267d17c6b64ca82053047"} Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.584280 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2d640f9752a39a3eb6f4f4f7dadd2e9d539e334382267d17c6b64ca82053047" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.584599 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-99fng" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.685820 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb"] Oct 03 18:55:18 crc kubenswrapper[4835]: E1003 18:55:18.686383 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6fd81f4-842f-4628-8044-45b76f848087" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.686404 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6fd81f4-842f-4628-8044-45b76f848087" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.686658 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6fd81f4-842f-4628-8044-45b76f848087" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.687427 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.689431 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.689514 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.689641 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.690034 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.690175 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.691552 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.692544 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.698245 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb"] Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.808355 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.808643 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.808673 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.808755 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.808779 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl2qx\" (UniqueName: 
\"kubernetes.io/projected/55d0501b-c32f-4bf7-b52f-e5b941d49926-kube-api-access-sl2qx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.808833 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.808972 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.809015 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.809215 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.910394 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.910446 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.910470 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.910529 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.910564 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.910641 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.910659 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.910677 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.910697 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl2qx\" (UniqueName: \"kubernetes.io/projected/55d0501b-c32f-4bf7-b52f-e5b941d49926-kube-api-access-sl2qx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.912188 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.912934 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.912938 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.913022 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.913167 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.917410 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.922497 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.924004 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.924133 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.924794 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.924902 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.925407 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.926380 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:18 crc kubenswrapper[4835]: I1003 18:55:18.928117 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl2qx\" (UniqueName: \"kubernetes.io/projected/55d0501b-c32f-4bf7-b52f-e5b941d49926-kube-api-access-sl2qx\") pod \"nova-edpm-deployment-openstack-edpm-ipam-h27nb\" 
(UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:19 crc kubenswrapper[4835]: I1003 18:55:19.005872 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:55:19 crc kubenswrapper[4835]: I1003 18:55:19.013699 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:55:19 crc kubenswrapper[4835]: I1003 18:55:19.569461 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb"] Oct 03 18:55:19 crc kubenswrapper[4835]: I1003 18:55:19.578520 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 18:55:19 crc kubenswrapper[4835]: I1003 18:55:19.594699 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" event={"ID":"55d0501b-c32f-4bf7-b52f-e5b941d49926","Type":"ContainerStarted","Data":"5796d60f70fdb98e447f8388a2236f17578069ee8fc181b35ca95efe78913df9"} Oct 03 18:55:20 crc kubenswrapper[4835]: I1003 18:55:20.364843 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:55:21 crc kubenswrapper[4835]: I1003 18:55:21.614583 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" event={"ID":"55d0501b-c32f-4bf7-b52f-e5b941d49926","Type":"ContainerStarted","Data":"7d51111c2bb344009d7b5cc7fda863049b6898a75c61cc499ee7efeac56e76b1"} Oct 03 18:55:21 crc kubenswrapper[4835]: I1003 18:55:21.640605 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" podStartSLOduration=2.856564827 podStartE2EDuration="3.640584798s" podCreationTimestamp="2025-10-03 18:55:18 +0000 UTC" firstStartedPulling="2025-10-03 18:55:19.578282283 +0000 UTC m=+2461.294223155" lastFinishedPulling="2025-10-03 18:55:20.362302254 +0000 UTC m=+2462.078243126" observedRunningTime="2025-10-03 18:55:21.630653964 +0000 UTC m=+2463.346594866" watchObservedRunningTime="2025-10-03 18:55:21.640584798 +0000 UTC m=+2463.356525680" Oct 03 18:55:23 crc kubenswrapper[4835]: I1003 18:55:23.877815 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:55:23 crc kubenswrapper[4835]: E1003 18:55:23.878474 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:55:36 crc kubenswrapper[4835]: I1003 18:55:36.876921 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:55:36 crc kubenswrapper[4835]: E1003 18:55:36.877674 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:55:49 crc kubenswrapper[4835]: I1003 18:55:49.876990 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:55:49 crc kubenswrapper[4835]: E1003 18:55:49.877822 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:56:01 crc kubenswrapper[4835]: I1003 18:56:01.877043 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:56:01 crc kubenswrapper[4835]: E1003 18:56:01.877801 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:56:16 crc kubenswrapper[4835]: I1003 18:56:16.877311 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:56:16 crc kubenswrapper[4835]: E1003 18:56:16.878375 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:56:30 crc kubenswrapper[4835]: I1003 18:56:30.878968 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:56:30 crc kubenswrapper[4835]: E1003 18:56:30.879813 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 18:56:41 crc kubenswrapper[4835]: I1003 18:56:41.876453 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 18:56:42 crc kubenswrapper[4835]: I1003 18:56:42.335403 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"9398b6fcb039b8d8f5e49cb35136b0dc88780d3c5fd88ccc393e7775caa98125"} Oct 03 18:57:58 crc kubenswrapper[4835]: I1003 18:57:58.582400 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ml7kt"] Oct 03 18:57:58 crc kubenswrapper[4835]: I1003 18:57:58.585643 
4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:57:58 crc kubenswrapper[4835]: I1003 18:57:58.595459 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml7kt"] Oct 03 18:57:58 crc kubenswrapper[4835]: I1003 18:57:58.711586 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-utilities\") pod \"redhat-marketplace-ml7kt\" (UID: \"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44\") " pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:57:58 crc kubenswrapper[4835]: I1003 18:57:58.712209 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk5p7\" (UniqueName: \"kubernetes.io/projected/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-kube-api-access-fk5p7\") pod \"redhat-marketplace-ml7kt\" (UID: \"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44\") " pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:57:58 crc kubenswrapper[4835]: I1003 18:57:58.712304 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-catalog-content\") pod \"redhat-marketplace-ml7kt\" (UID: \"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44\") " pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:57:58 crc kubenswrapper[4835]: I1003 18:57:58.813772 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-utilities\") pod \"redhat-marketplace-ml7kt\" (UID: \"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44\") " pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:57:58 crc kubenswrapper[4835]: I1003 18:57:58.814040 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk5p7\" (UniqueName: \"kubernetes.io/projected/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-kube-api-access-fk5p7\") pod \"redhat-marketplace-ml7kt\" (UID: \"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44\") " pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:57:58 crc kubenswrapper[4835]: I1003 18:57:58.814109 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-catalog-content\") pod \"redhat-marketplace-ml7kt\" (UID: \"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44\") " pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:57:58 crc kubenswrapper[4835]: I1003 18:57:58.814526 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-catalog-content\") pod \"redhat-marketplace-ml7kt\" (UID: \"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44\") " pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:57:58 crc kubenswrapper[4835]: I1003 18:57:58.814746 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-utilities\") pod \"redhat-marketplace-ml7kt\" (UID: \"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44\") " pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:57:58 crc kubenswrapper[4835]: I1003 18:57:58.835010 
4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk5p7\" (UniqueName: \"kubernetes.io/projected/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-kube-api-access-fk5p7\") pod \"redhat-marketplace-ml7kt\" (UID: \"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44\") " pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:57:58 crc kubenswrapper[4835]: I1003 18:57:58.909126 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:57:59 crc kubenswrapper[4835]: I1003 18:57:59.413184 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml7kt"] Oct 03 18:58:00 crc kubenswrapper[4835]: I1003 18:58:00.099569 4835 generic.go:334] "Generic (PLEG): container finished" podID="cc0cf801-d6b3-47b8-b82e-6f5d6191bc44" containerID="f1f4eea95d0147b3c91bac037a1be8db0b88d9b387808f45e1ca07e66cf30ed8" exitCode=0 Oct 03 18:58:00 crc kubenswrapper[4835]: I1003 18:58:00.099742 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml7kt" event={"ID":"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44","Type":"ContainerDied","Data":"f1f4eea95d0147b3c91bac037a1be8db0b88d9b387808f45e1ca07e66cf30ed8"} Oct 03 18:58:00 crc kubenswrapper[4835]: I1003 18:58:00.099847 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml7kt" event={"ID":"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44","Type":"ContainerStarted","Data":"72175a99d5703f1817a21d905752dec3f86f1da86654fa0f4a14c677770acdee"} Oct 03 18:58:01 crc kubenswrapper[4835]: I1003 18:58:01.110245 4835 generic.go:334] "Generic (PLEG): container finished" podID="cc0cf801-d6b3-47b8-b82e-6f5d6191bc44" containerID="b61c4a4a0c703a5e1331989c3384a27add5bc79c7447c1764a869a3123848612" exitCode=0 Oct 03 18:58:01 crc kubenswrapper[4835]: I1003 18:58:01.110294 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml7kt" event={"ID":"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44","Type":"ContainerDied","Data":"b61c4a4a0c703a5e1331989c3384a27add5bc79c7447c1764a869a3123848612"} Oct 03 18:58:02 crc kubenswrapper[4835]: I1003 18:58:02.120986 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml7kt" event={"ID":"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44","Type":"ContainerStarted","Data":"acb5d97dcf436e563fdebaf3b986a434217fa0baa6721b74a8aaa306e5ca1e62"} Oct 03 18:58:02 crc kubenswrapper[4835]: I1003 18:58:02.144974 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ml7kt" podStartSLOduration=2.737926978 podStartE2EDuration="4.144956418s" podCreationTimestamp="2025-10-03 18:57:58 +0000 UTC" firstStartedPulling="2025-10-03 18:58:00.101815965 +0000 UTC m=+2621.817756837" lastFinishedPulling="2025-10-03 18:58:01.508845395 +0000 UTC m=+2623.224786277" observedRunningTime="2025-10-03 18:58:02.137498154 +0000 UTC m=+2623.853439026" watchObservedRunningTime="2025-10-03 18:58:02.144956418 +0000 UTC m=+2623.860897290" Oct 03 18:58:08 crc kubenswrapper[4835]: I1003 18:58:08.909224 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:58:08 crc kubenswrapper[4835]: I1003 18:58:08.909756 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:58:08 crc kubenswrapper[4835]: 
I1003 18:58:08.953466 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:58:09 crc kubenswrapper[4835]: I1003 18:58:09.227711 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:58:09 crc kubenswrapper[4835]: I1003 18:58:09.293370 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml7kt"] Oct 03 18:58:11 crc kubenswrapper[4835]: I1003 18:58:11.201599 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ml7kt" podUID="cc0cf801-d6b3-47b8-b82e-6f5d6191bc44" containerName="registry-server" containerID="cri-o://acb5d97dcf436e563fdebaf3b986a434217fa0baa6721b74a8aaa306e5ca1e62" gracePeriod=2 Oct 03 18:58:11 crc kubenswrapper[4835]: I1003 18:58:11.675400 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:58:11 crc kubenswrapper[4835]: I1003 18:58:11.762872 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-utilities\") pod \"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44\" (UID: \"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44\") " Oct 03 18:58:11 crc kubenswrapper[4835]: I1003 18:58:11.762982 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk5p7\" (UniqueName: \"kubernetes.io/projected/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-kube-api-access-fk5p7\") pod \"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44\" (UID: \"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44\") " Oct 03 18:58:11 crc kubenswrapper[4835]: I1003 18:58:11.763030 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-catalog-content\") pod \"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44\" (UID: \"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44\") " Oct 03 18:58:11 crc kubenswrapper[4835]: I1003 18:58:11.763808 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-utilities" (OuterVolumeSpecName: "utilities") pod "cc0cf801-d6b3-47b8-b82e-6f5d6191bc44" (UID: "cc0cf801-d6b3-47b8-b82e-6f5d6191bc44"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:58:11 crc kubenswrapper[4835]: I1003 18:58:11.769034 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-kube-api-access-fk5p7" (OuterVolumeSpecName: "kube-api-access-fk5p7") pod "cc0cf801-d6b3-47b8-b82e-6f5d6191bc44" (UID: "cc0cf801-d6b3-47b8-b82e-6f5d6191bc44"). InnerVolumeSpecName "kube-api-access-fk5p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:58:11 crc kubenswrapper[4835]: I1003 18:58:11.777229 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc0cf801-d6b3-47b8-b82e-6f5d6191bc44" (UID: "cc0cf801-d6b3-47b8-b82e-6f5d6191bc44"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:58:11 crc kubenswrapper[4835]: I1003 18:58:11.865784 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:58:11 crc kubenswrapper[4835]: I1003 18:58:11.866259 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk5p7\" (UniqueName: \"kubernetes.io/projected/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-kube-api-access-fk5p7\") on node \"crc\" DevicePath \"\"" Oct 03 18:58:11 crc kubenswrapper[4835]: I1003 18:58:11.866278 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:58:12 crc kubenswrapper[4835]: I1003 18:58:12.213806 4835 generic.go:334] "Generic (PLEG): container finished" podID="cc0cf801-d6b3-47b8-b82e-6f5d6191bc44" containerID="acb5d97dcf436e563fdebaf3b986a434217fa0baa6721b74a8aaa306e5ca1e62" exitCode=0 Oct 03 18:58:12 crc kubenswrapper[4835]: I1003 18:58:12.213851 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml7kt" event={"ID":"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44","Type":"ContainerDied","Data":"acb5d97dcf436e563fdebaf3b986a434217fa0baa6721b74a8aaa306e5ca1e62"} Oct 03 18:58:12 crc kubenswrapper[4835]: I1003 18:58:12.213877 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml7kt" event={"ID":"cc0cf801-d6b3-47b8-b82e-6f5d6191bc44","Type":"ContainerDied","Data":"72175a99d5703f1817a21d905752dec3f86f1da86654fa0f4a14c677770acdee"} Oct 03 18:58:12 crc kubenswrapper[4835]: I1003 18:58:12.213893 4835 scope.go:117] "RemoveContainer" containerID="acb5d97dcf436e563fdebaf3b986a434217fa0baa6721b74a8aaa306e5ca1e62" Oct 03 18:58:12 crc kubenswrapper[4835]: I1003 18:58:12.214012 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ml7kt" Oct 03 18:58:12 crc kubenswrapper[4835]: I1003 18:58:12.246191 4835 scope.go:117] "RemoveContainer" containerID="b61c4a4a0c703a5e1331989c3384a27add5bc79c7447c1764a869a3123848612" Oct 03 18:58:12 crc kubenswrapper[4835]: I1003 18:58:12.250248 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml7kt"] Oct 03 18:58:12 crc kubenswrapper[4835]: I1003 18:58:12.272518 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml7kt"] Oct 03 18:58:12 crc kubenswrapper[4835]: I1003 18:58:12.292964 4835 scope.go:117] "RemoveContainer" containerID="f1f4eea95d0147b3c91bac037a1be8db0b88d9b387808f45e1ca07e66cf30ed8" Oct 03 18:58:12 crc kubenswrapper[4835]: I1003 18:58:12.353852 4835 scope.go:117] "RemoveContainer" containerID="acb5d97dcf436e563fdebaf3b986a434217fa0baa6721b74a8aaa306e5ca1e62" Oct 03 18:58:12 crc kubenswrapper[4835]: E1003 18:58:12.354357 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acb5d97dcf436e563fdebaf3b986a434217fa0baa6721b74a8aaa306e5ca1e62\": container with ID starting with acb5d97dcf436e563fdebaf3b986a434217fa0baa6721b74a8aaa306e5ca1e62 not found: ID does not exist" containerID="acb5d97dcf436e563fdebaf3b986a434217fa0baa6721b74a8aaa306e5ca1e62" Oct 03 18:58:12 crc kubenswrapper[4835]: I1003 18:58:12.354407 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb5d97dcf436e563fdebaf3b986a434217fa0baa6721b74a8aaa306e5ca1e62"} err="failed to get container status \"acb5d97dcf436e563fdebaf3b986a434217fa0baa6721b74a8aaa306e5ca1e62\": rpc error: code = NotFound desc = could not find container \"acb5d97dcf436e563fdebaf3b986a434217fa0baa6721b74a8aaa306e5ca1e62\": container with ID starting with acb5d97dcf436e563fdebaf3b986a434217fa0baa6721b74a8aaa306e5ca1e62 not found: ID does not exist" Oct 03 18:58:12 crc kubenswrapper[4835]: I1003 18:58:12.354437 4835 scope.go:117] "RemoveContainer" containerID="b61c4a4a0c703a5e1331989c3384a27add5bc79c7447c1764a869a3123848612" Oct 03 18:58:12 crc kubenswrapper[4835]: E1003 18:58:12.354782 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b61c4a4a0c703a5e1331989c3384a27add5bc79c7447c1764a869a3123848612\": container with ID starting with b61c4a4a0c703a5e1331989c3384a27add5bc79c7447c1764a869a3123848612 not found: ID does not exist" containerID="b61c4a4a0c703a5e1331989c3384a27add5bc79c7447c1764a869a3123848612" Oct 03 18:58:12 crc kubenswrapper[4835]: I1003 18:58:12.354807 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61c4a4a0c703a5e1331989c3384a27add5bc79c7447c1764a869a3123848612"} err="failed to get container status \"b61c4a4a0c703a5e1331989c3384a27add5bc79c7447c1764a869a3123848612\": rpc error: code = NotFound desc = could not find container \"b61c4a4a0c703a5e1331989c3384a27add5bc79c7447c1764a869a3123848612\": container with ID starting with b61c4a4a0c703a5e1331989c3384a27add5bc79c7447c1764a869a3123848612 not found: ID does not exist" Oct 03 18:58:12 crc kubenswrapper[4835]: I1003 18:58:12.354822 4835 scope.go:117] "RemoveContainer" containerID="f1f4eea95d0147b3c91bac037a1be8db0b88d9b387808f45e1ca07e66cf30ed8" Oct 03 18:58:12 crc kubenswrapper[4835]: E1003 18:58:12.355054 4835 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f1f4eea95d0147b3c91bac037a1be8db0b88d9b387808f45e1ca07e66cf30ed8\": container with ID starting with f1f4eea95d0147b3c91bac037a1be8db0b88d9b387808f45e1ca07e66cf30ed8 not found: ID does not exist" containerID="f1f4eea95d0147b3c91bac037a1be8db0b88d9b387808f45e1ca07e66cf30ed8" Oct 03 18:58:12 crc kubenswrapper[4835]: I1003 18:58:12.355092 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f4eea95d0147b3c91bac037a1be8db0b88d9b387808f45e1ca07e66cf30ed8"} err="failed to get container status \"f1f4eea95d0147b3c91bac037a1be8db0b88d9b387808f45e1ca07e66cf30ed8\": rpc error: code = NotFound desc = could not find container \"f1f4eea95d0147b3c91bac037a1be8db0b88d9b387808f45e1ca07e66cf30ed8\": container with ID starting with f1f4eea95d0147b3c91bac037a1be8db0b88d9b387808f45e1ca07e66cf30ed8 not found: ID does not exist" Oct 03 18:58:12 crc kubenswrapper[4835]: I1003 18:58:12.894808 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc0cf801-d6b3-47b8-b82e-6f5d6191bc44" path="/var/lib/kubelet/pods/cc0cf801-d6b3-47b8-b82e-6f5d6191bc44/volumes" Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.454494 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tn4g6"] Oct 03 18:58:47 crc kubenswrapper[4835]: E1003 18:58:47.455408 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0cf801-d6b3-47b8-b82e-6f5d6191bc44" containerName="extract-content" Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.455420 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0cf801-d6b3-47b8-b82e-6f5d6191bc44" containerName="extract-content" Oct 03 18:58:47 crc kubenswrapper[4835]: E1003 18:58:47.455436 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0cf801-d6b3-47b8-b82e-6f5d6191bc44" containerName="extract-utilities" Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.455442 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0cf801-d6b3-47b8-b82e-6f5d6191bc44" containerName="extract-utilities" Oct 03 18:58:47 crc kubenswrapper[4835]: E1003 18:58:47.455472 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0cf801-d6b3-47b8-b82e-6f5d6191bc44" containerName="registry-server" Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.455479 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0cf801-d6b3-47b8-b82e-6f5d6191bc44" containerName="registry-server" Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.455689 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc0cf801-d6b3-47b8-b82e-6f5d6191bc44" containerName="registry-server" Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.457274 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tn4g6" Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.487136 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tn4g6"] Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.588999 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bab0ea50-e57b-45c0-82db-8ee038afb28d-catalog-content\") pod \"redhat-operators-tn4g6\" (UID: \"bab0ea50-e57b-45c0-82db-8ee038afb28d\") " pod="openshift-marketplace/redhat-operators-tn4g6" Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.589041 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bab0ea50-e57b-45c0-82db-8ee038afb28d-utilities\") pod \"redhat-operators-tn4g6\" (UID: \"bab0ea50-e57b-45c0-82db-8ee038afb28d\") " pod="openshift-marketplace/redhat-operators-tn4g6" Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.589166 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhgdl\" (UniqueName: \"kubernetes.io/projected/bab0ea50-e57b-45c0-82db-8ee038afb28d-kube-api-access-zhgdl\") pod \"redhat-operators-tn4g6\" (UID: \"bab0ea50-e57b-45c0-82db-8ee038afb28d\") " pod="openshift-marketplace/redhat-operators-tn4g6" Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.690838 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bab0ea50-e57b-45c0-82db-8ee038afb28d-catalog-content\") pod \"redhat-operators-tn4g6\" (UID: \"bab0ea50-e57b-45c0-82db-8ee038afb28d\") " pod="openshift-marketplace/redhat-operators-tn4g6" Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.690887 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bab0ea50-e57b-45c0-82db-8ee038afb28d-utilities\") pod \"redhat-operators-tn4g6\" (UID: \"bab0ea50-e57b-45c0-82db-8ee038afb28d\") " pod="openshift-marketplace/redhat-operators-tn4g6" Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.690963 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhgdl\" (UniqueName: \"kubernetes.io/projected/bab0ea50-e57b-45c0-82db-8ee038afb28d-kube-api-access-zhgdl\") pod \"redhat-operators-tn4g6\" (UID: \"bab0ea50-e57b-45c0-82db-8ee038afb28d\") " pod="openshift-marketplace/redhat-operators-tn4g6" Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.691442 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bab0ea50-e57b-45c0-82db-8ee038afb28d-catalog-content\") pod \"redhat-operators-tn4g6\" (UID: \"bab0ea50-e57b-45c0-82db-8ee038afb28d\") " pod="openshift-marketplace/redhat-operators-tn4g6" Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.691648 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bab0ea50-e57b-45c0-82db-8ee038afb28d-utilities\") pod \"redhat-operators-tn4g6\" (UID: \"bab0ea50-e57b-45c0-82db-8ee038afb28d\") " pod="openshift-marketplace/redhat-operators-tn4g6" Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.719030 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zhgdl\" (UniqueName: \"kubernetes.io/projected/bab0ea50-e57b-45c0-82db-8ee038afb28d-kube-api-access-zhgdl\") pod \"redhat-operators-tn4g6\" (UID: \"bab0ea50-e57b-45c0-82db-8ee038afb28d\") " pod="openshift-marketplace/redhat-operators-tn4g6" Oct 03 18:58:47 crc kubenswrapper[4835]: I1003 18:58:47.787095 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tn4g6" Oct 03 18:58:48 crc kubenswrapper[4835]: I1003 18:58:48.250818 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tn4g6"] Oct 03 18:58:48 crc kubenswrapper[4835]: I1003 18:58:48.532273 4835 generic.go:334] "Generic (PLEG): container finished" podID="bab0ea50-e57b-45c0-82db-8ee038afb28d" containerID="d003511db5032b6dcbec259a1980aeffa63769985be761f1849503675b58e93d" exitCode=0 Oct 03 18:58:48 crc kubenswrapper[4835]: I1003 18:58:48.532351 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tn4g6" event={"ID":"bab0ea50-e57b-45c0-82db-8ee038afb28d","Type":"ContainerDied","Data":"d003511db5032b6dcbec259a1980aeffa63769985be761f1849503675b58e93d"} Oct 03 18:58:48 crc kubenswrapper[4835]: I1003 18:58:48.532582 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tn4g6" event={"ID":"bab0ea50-e57b-45c0-82db-8ee038afb28d","Type":"ContainerStarted","Data":"c6e402db0e84e3584d8425cfbee07cd95e861edcdfb7b08ef170bb5b6f433d48"} Oct 03 18:58:49 crc kubenswrapper[4835]: I1003 18:58:49.544444 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tn4g6" event={"ID":"bab0ea50-e57b-45c0-82db-8ee038afb28d","Type":"ContainerStarted","Data":"e1e5de773e6ac58f69e26c978ff5221b7577f063f46b809890847490e930b7dc"} Oct 03 18:58:50 crc kubenswrapper[4835]: I1003 18:58:50.561575 4835 generic.go:334] "Generic (PLEG): container finished" podID="bab0ea50-e57b-45c0-82db-8ee038afb28d" containerID="e1e5de773e6ac58f69e26c978ff5221b7577f063f46b809890847490e930b7dc" exitCode=0 Oct 03 18:58:50 crc kubenswrapper[4835]: I1003 18:58:50.562026 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tn4g6" event={"ID":"bab0ea50-e57b-45c0-82db-8ee038afb28d","Type":"ContainerDied","Data":"e1e5de773e6ac58f69e26c978ff5221b7577f063f46b809890847490e930b7dc"} Oct 03 18:58:51 crc kubenswrapper[4835]: I1003 18:58:51.574136 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tn4g6" event={"ID":"bab0ea50-e57b-45c0-82db-8ee038afb28d","Type":"ContainerStarted","Data":"bab8c94192d4dd9a909da026090272899b201584ba25c75cdd2dc70edb5c11cb"} Oct 03 18:58:51 crc kubenswrapper[4835]: I1003 18:58:51.593344 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tn4g6" podStartSLOduration=2.13229469 podStartE2EDuration="4.593328042s" podCreationTimestamp="2025-10-03 18:58:47 +0000 UTC" firstStartedPulling="2025-10-03 18:58:48.533886808 +0000 UTC m=+2670.249827680" lastFinishedPulling="2025-10-03 18:58:50.99492016 +0000 UTC m=+2672.710861032" observedRunningTime="2025-10-03 18:58:51.590158354 +0000 UTC m=+2673.306099226" watchObservedRunningTime="2025-10-03 18:58:51.593328042 +0000 UTC m=+2673.309268914" Oct 03 18:58:57 crc kubenswrapper[4835]: I1003 18:58:57.787703 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tn4g6" Oct 
03 18:58:57 crc kubenswrapper[4835]: I1003 18:58:57.788247 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tn4g6" Oct 03 18:58:57 crc kubenswrapper[4835]: I1003 18:58:57.844458 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tn4g6" Oct 03 18:58:58 crc kubenswrapper[4835]: I1003 18:58:58.680807 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tn4g6" Oct 03 18:58:58 crc kubenswrapper[4835]: I1003 18:58:58.730052 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tn4g6"] Oct 03 18:59:00 crc kubenswrapper[4835]: I1003 18:59:00.648771 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tn4g6" podUID="bab0ea50-e57b-45c0-82db-8ee038afb28d" containerName="registry-server" containerID="cri-o://bab8c94192d4dd9a909da026090272899b201584ba25c75cdd2dc70edb5c11cb" gracePeriod=2 Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.093863 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tn4g6" Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.193639 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bab0ea50-e57b-45c0-82db-8ee038afb28d-catalog-content\") pod \"bab0ea50-e57b-45c0-82db-8ee038afb28d\" (UID: \"bab0ea50-e57b-45c0-82db-8ee038afb28d\") " Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.193729 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhgdl\" (UniqueName: \"kubernetes.io/projected/bab0ea50-e57b-45c0-82db-8ee038afb28d-kube-api-access-zhgdl\") pod \"bab0ea50-e57b-45c0-82db-8ee038afb28d\" (UID: \"bab0ea50-e57b-45c0-82db-8ee038afb28d\") " Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.193807 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bab0ea50-e57b-45c0-82db-8ee038afb28d-utilities\") pod \"bab0ea50-e57b-45c0-82db-8ee038afb28d\" (UID: \"bab0ea50-e57b-45c0-82db-8ee038afb28d\") " Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.194706 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bab0ea50-e57b-45c0-82db-8ee038afb28d-utilities" (OuterVolumeSpecName: "utilities") pod "bab0ea50-e57b-45c0-82db-8ee038afb28d" (UID: "bab0ea50-e57b-45c0-82db-8ee038afb28d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.199746 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bab0ea50-e57b-45c0-82db-8ee038afb28d-kube-api-access-zhgdl" (OuterVolumeSpecName: "kube-api-access-zhgdl") pod "bab0ea50-e57b-45c0-82db-8ee038afb28d" (UID: "bab0ea50-e57b-45c0-82db-8ee038afb28d"). InnerVolumeSpecName "kube-api-access-zhgdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.271653 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bab0ea50-e57b-45c0-82db-8ee038afb28d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bab0ea50-e57b-45c0-82db-8ee038afb28d" (UID: "bab0ea50-e57b-45c0-82db-8ee038afb28d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.296531 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bab0ea50-e57b-45c0-82db-8ee038afb28d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.296855 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bab0ea50-e57b-45c0-82db-8ee038afb28d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.297032 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhgdl\" (UniqueName: \"kubernetes.io/projected/bab0ea50-e57b-45c0-82db-8ee038afb28d-kube-api-access-zhgdl\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.661887 4835 generic.go:334] "Generic (PLEG): container finished" podID="bab0ea50-e57b-45c0-82db-8ee038afb28d" containerID="bab8c94192d4dd9a909da026090272899b201584ba25c75cdd2dc70edb5c11cb" exitCode=0 Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.661935 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tn4g6" event={"ID":"bab0ea50-e57b-45c0-82db-8ee038afb28d","Type":"ContainerDied","Data":"bab8c94192d4dd9a909da026090272899b201584ba25c75cdd2dc70edb5c11cb"} Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.661950 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tn4g6" Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.661965 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tn4g6" event={"ID":"bab0ea50-e57b-45c0-82db-8ee038afb28d","Type":"ContainerDied","Data":"c6e402db0e84e3584d8425cfbee07cd95e861edcdfb7b08ef170bb5b6f433d48"} Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.661991 4835 scope.go:117] "RemoveContainer" containerID="bab8c94192d4dd9a909da026090272899b201584ba25c75cdd2dc70edb5c11cb" Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.682412 4835 scope.go:117] "RemoveContainer" containerID="e1e5de773e6ac58f69e26c978ff5221b7577f063f46b809890847490e930b7dc" Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.714388 4835 scope.go:117] "RemoveContainer" containerID="d003511db5032b6dcbec259a1980aeffa63769985be761f1849503675b58e93d" Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.718219 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tn4g6"] Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.728830 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tn4g6"] Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.761521 4835 scope.go:117] "RemoveContainer" containerID="bab8c94192d4dd9a909da026090272899b201584ba25c75cdd2dc70edb5c11cb" Oct 03 18:59:01 crc kubenswrapper[4835]: E1003 18:59:01.762015 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab8c94192d4dd9a909da026090272899b201584ba25c75cdd2dc70edb5c11cb\": container with ID starting with bab8c94192d4dd9a909da026090272899b201584ba25c75cdd2dc70edb5c11cb not found: ID does not exist" containerID="bab8c94192d4dd9a909da026090272899b201584ba25c75cdd2dc70edb5c11cb" Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.762059 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab8c94192d4dd9a909da026090272899b201584ba25c75cdd2dc70edb5c11cb"} err="failed to get container status \"bab8c94192d4dd9a909da026090272899b201584ba25c75cdd2dc70edb5c11cb\": rpc error: code = NotFound desc = could not find container \"bab8c94192d4dd9a909da026090272899b201584ba25c75cdd2dc70edb5c11cb\": container with ID starting with bab8c94192d4dd9a909da026090272899b201584ba25c75cdd2dc70edb5c11cb not found: ID does not exist" Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.762103 4835 scope.go:117] "RemoveContainer" containerID="e1e5de773e6ac58f69e26c978ff5221b7577f063f46b809890847490e930b7dc" Oct 03 18:59:01 crc kubenswrapper[4835]: E1003 18:59:01.762670 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1e5de773e6ac58f69e26c978ff5221b7577f063f46b809890847490e930b7dc\": container with ID starting with e1e5de773e6ac58f69e26c978ff5221b7577f063f46b809890847490e930b7dc not found: ID does not exist" containerID="e1e5de773e6ac58f69e26c978ff5221b7577f063f46b809890847490e930b7dc" Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.762713 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e5de773e6ac58f69e26c978ff5221b7577f063f46b809890847490e930b7dc"} err="failed to get container status \"e1e5de773e6ac58f69e26c978ff5221b7577f063f46b809890847490e930b7dc\": rpc error: code = NotFound desc = could not find container 
\"e1e5de773e6ac58f69e26c978ff5221b7577f063f46b809890847490e930b7dc\": container with ID starting with e1e5de773e6ac58f69e26c978ff5221b7577f063f46b809890847490e930b7dc not found: ID does not exist" Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.762739 4835 scope.go:117] "RemoveContainer" containerID="d003511db5032b6dcbec259a1980aeffa63769985be761f1849503675b58e93d" Oct 03 18:59:01 crc kubenswrapper[4835]: E1003 18:59:01.766829 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d003511db5032b6dcbec259a1980aeffa63769985be761f1849503675b58e93d\": container with ID starting with d003511db5032b6dcbec259a1980aeffa63769985be761f1849503675b58e93d not found: ID does not exist" containerID="d003511db5032b6dcbec259a1980aeffa63769985be761f1849503675b58e93d" Oct 03 18:59:01 crc kubenswrapper[4835]: I1003 18:59:01.766873 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d003511db5032b6dcbec259a1980aeffa63769985be761f1849503675b58e93d"} err="failed to get container status \"d003511db5032b6dcbec259a1980aeffa63769985be761f1849503675b58e93d\": rpc error: code = NotFound desc = could not find container \"d003511db5032b6dcbec259a1980aeffa63769985be761f1849503675b58e93d\": container with ID starting with d003511db5032b6dcbec259a1980aeffa63769985be761f1849503675b58e93d not found: ID does not exist" Oct 03 18:59:02 crc kubenswrapper[4835]: I1003 18:59:02.888862 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bab0ea50-e57b-45c0-82db-8ee038afb28d" path="/var/lib/kubelet/pods/bab0ea50-e57b-45c0-82db-8ee038afb28d/volumes" Oct 03 18:59:04 crc kubenswrapper[4835]: I1003 18:59:04.693412 4835 generic.go:334] "Generic (PLEG): container finished" podID="55d0501b-c32f-4bf7-b52f-e5b941d49926" containerID="7d51111c2bb344009d7b5cc7fda863049b6898a75c61cc499ee7efeac56e76b1" exitCode=0 Oct 03 18:59:04 crc kubenswrapper[4835]: I1003 18:59:04.693497 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" event={"ID":"55d0501b-c32f-4bf7-b52f-e5b941d49926","Type":"ContainerDied","Data":"7d51111c2bb344009d7b5cc7fda863049b6898a75c61cc499ee7efeac56e76b1"} Oct 03 18:59:05 crc kubenswrapper[4835]: I1003 18:59:05.358906 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:59:05 crc kubenswrapper[4835]: I1003 18:59:05.358982 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.076559 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.190804 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-ssh-key\") pod \"55d0501b-c32f-4bf7-b52f-e5b941d49926\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.190885 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-inventory\") pod \"55d0501b-c32f-4bf7-b52f-e5b941d49926\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.190918 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-extra-config-0\") pod \"55d0501b-c32f-4bf7-b52f-e5b941d49926\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.190958 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-migration-ssh-key-0\") pod \"55d0501b-c32f-4bf7-b52f-e5b941d49926\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.191025 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-migration-ssh-key-1\") pod \"55d0501b-c32f-4bf7-b52f-e5b941d49926\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.191053 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-combined-ca-bundle\") pod \"55d0501b-c32f-4bf7-b52f-e5b941d49926\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.191110 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-cell1-compute-config-1\") pod \"55d0501b-c32f-4bf7-b52f-e5b941d49926\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.191184 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl2qx\" (UniqueName: \"kubernetes.io/projected/55d0501b-c32f-4bf7-b52f-e5b941d49926-kube-api-access-sl2qx\") pod \"55d0501b-c32f-4bf7-b52f-e5b941d49926\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.191236 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-cell1-compute-config-0\") pod \"55d0501b-c32f-4bf7-b52f-e5b941d49926\" (UID: \"55d0501b-c32f-4bf7-b52f-e5b941d49926\") " Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.196961 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/55d0501b-c32f-4bf7-b52f-e5b941d49926-kube-api-access-sl2qx" (OuterVolumeSpecName: "kube-api-access-sl2qx") pod "55d0501b-c32f-4bf7-b52f-e5b941d49926" (UID: "55d0501b-c32f-4bf7-b52f-e5b941d49926"). InnerVolumeSpecName "kube-api-access-sl2qx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.198090 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "55d0501b-c32f-4bf7-b52f-e5b941d49926" (UID: "55d0501b-c32f-4bf7-b52f-e5b941d49926"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.221719 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "55d0501b-c32f-4bf7-b52f-e5b941d49926" (UID: "55d0501b-c32f-4bf7-b52f-e5b941d49926"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.221781 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "55d0501b-c32f-4bf7-b52f-e5b941d49926" (UID: "55d0501b-c32f-4bf7-b52f-e5b941d49926"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.223072 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-inventory" (OuterVolumeSpecName: "inventory") pod "55d0501b-c32f-4bf7-b52f-e5b941d49926" (UID: "55d0501b-c32f-4bf7-b52f-e5b941d49926"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.230037 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "55d0501b-c32f-4bf7-b52f-e5b941d49926" (UID: "55d0501b-c32f-4bf7-b52f-e5b941d49926"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.230787 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "55d0501b-c32f-4bf7-b52f-e5b941d49926" (UID: "55d0501b-c32f-4bf7-b52f-e5b941d49926"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.230950 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "55d0501b-c32f-4bf7-b52f-e5b941d49926" (UID: "55d0501b-c32f-4bf7-b52f-e5b941d49926"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.233035 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "55d0501b-c32f-4bf7-b52f-e5b941d49926" (UID: "55d0501b-c32f-4bf7-b52f-e5b941d49926"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.293860 4835 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.293904 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.293918 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.293932 4835 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.293945 4835 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.293958 4835 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.293971 4835 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.293982 4835 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/55d0501b-c32f-4bf7-b52f-e5b941d49926-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.293994 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl2qx\" (UniqueName: \"kubernetes.io/projected/55d0501b-c32f-4bf7-b52f-e5b941d49926-kube-api-access-sl2qx\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.711589 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" event={"ID":"55d0501b-c32f-4bf7-b52f-e5b941d49926","Type":"ContainerDied","Data":"5796d60f70fdb98e447f8388a2236f17578069ee8fc181b35ca95efe78913df9"} Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.711637 4835 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="5796d60f70fdb98e447f8388a2236f17578069ee8fc181b35ca95efe78913df9" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.711712 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-h27nb" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.804419 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp"] Oct 03 18:59:06 crc kubenswrapper[4835]: E1003 18:59:06.805098 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab0ea50-e57b-45c0-82db-8ee038afb28d" containerName="extract-utilities" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.805182 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab0ea50-e57b-45c0-82db-8ee038afb28d" containerName="extract-utilities" Oct 03 18:59:06 crc kubenswrapper[4835]: E1003 18:59:06.805257 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab0ea50-e57b-45c0-82db-8ee038afb28d" containerName="extract-content" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.805317 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab0ea50-e57b-45c0-82db-8ee038afb28d" containerName="extract-content" Oct 03 18:59:06 crc kubenswrapper[4835]: E1003 18:59:06.805388 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab0ea50-e57b-45c0-82db-8ee038afb28d" containerName="registry-server" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.805443 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab0ea50-e57b-45c0-82db-8ee038afb28d" containerName="registry-server" Oct 03 18:59:06 crc kubenswrapper[4835]: E1003 18:59:06.805526 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d0501b-c32f-4bf7-b52f-e5b941d49926" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.805591 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d0501b-c32f-4bf7-b52f-e5b941d49926" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.805833 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab0ea50-e57b-45c0-82db-8ee038afb28d" containerName="registry-server" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.805911 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="55d0501b-c32f-4bf7-b52f-e5b941d49926" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.806648 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.810136 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.810254 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-bbktf" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.810772 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.811545 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.813281 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.819291 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp"] Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.905865 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7vvx\" (UniqueName: \"kubernetes.io/projected/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-kube-api-access-q7vvx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.905983 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.906098 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.906135 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.906157 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:06 crc 
kubenswrapper[4835]: I1003 18:59:06.906177 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:06 crc kubenswrapper[4835]: I1003 18:59:06.906248 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.007323 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7vvx\" (UniqueName: \"kubernetes.io/projected/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-kube-api-access-q7vvx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.007412 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.007498 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.007529 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.007553 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.007581 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.007652 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.011130 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.011265 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.011985 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.012747 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.014117 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.015366 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.025170 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7vvx\" (UniqueName: \"kubernetes.io/projected/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-kube-api-access-q7vvx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp\" (UID: 
\"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.125164 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.663324 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp"] Oct 03 18:59:07 crc kubenswrapper[4835]: W1003 18:59:07.674151 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2406a66_d20f_4ac5_9817_a1bf1ff38c5d.slice/crio-7fc77c50fd21328daf1df597f24bff4d20b510d649f57effad834ea19eb252d6 WatchSource:0}: Error finding container 7fc77c50fd21328daf1df597f24bff4d20b510d649f57effad834ea19eb252d6: Status 404 returned error can't find the container with id 7fc77c50fd21328daf1df597f24bff4d20b510d649f57effad834ea19eb252d6 Oct 03 18:59:07 crc kubenswrapper[4835]: I1003 18:59:07.727971 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" event={"ID":"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d","Type":"ContainerStarted","Data":"7fc77c50fd21328daf1df597f24bff4d20b510d649f57effad834ea19eb252d6"} Oct 03 18:59:09 crc kubenswrapper[4835]: I1003 18:59:09.747451 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" event={"ID":"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d","Type":"ContainerStarted","Data":"6d5db5efc55a51cc643e54380fa0c9eea8861d282478139847b36f1996f1b77a"} Oct 03 18:59:09 crc kubenswrapper[4835]: I1003 18:59:09.771605 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" podStartSLOduration=2.946919099 podStartE2EDuration="3.771582992s" podCreationTimestamp="2025-10-03 18:59:06 +0000 UTC" firstStartedPulling="2025-10-03 18:59:07.67938098 +0000 UTC m=+2689.395321862" lastFinishedPulling="2025-10-03 18:59:08.504044883 +0000 UTC m=+2690.219985755" observedRunningTime="2025-10-03 18:59:09.761866923 +0000 UTC m=+2691.477807795" watchObservedRunningTime="2025-10-03 18:59:09.771582992 +0000 UTC m=+2691.487523864" Oct 03 18:59:35 crc kubenswrapper[4835]: I1003 18:59:35.358120 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 18:59:35 crc kubenswrapper[4835]: I1003 18:59:35.358866 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 18:59:41 crc kubenswrapper[4835]: I1003 18:59:41.891959 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lvm2t"] Oct 03 18:59:41 crc kubenswrapper[4835]: I1003 18:59:41.896243 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:41 crc kubenswrapper[4835]: I1003 18:59:41.910741 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lvm2t"] Oct 03 18:59:42 crc kubenswrapper[4835]: I1003 18:59:42.026578 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbcvt\" (UniqueName: \"kubernetes.io/projected/95a76d39-4dd7-411f-9c40-dc445a9cf4db-kube-api-access-sbcvt\") pod \"certified-operators-lvm2t\" (UID: \"95a76d39-4dd7-411f-9c40-dc445a9cf4db\") " pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:42 crc kubenswrapper[4835]: I1003 18:59:42.026947 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95a76d39-4dd7-411f-9c40-dc445a9cf4db-utilities\") pod \"certified-operators-lvm2t\" (UID: \"95a76d39-4dd7-411f-9c40-dc445a9cf4db\") " pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:42 crc kubenswrapper[4835]: I1003 18:59:42.027016 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95a76d39-4dd7-411f-9c40-dc445a9cf4db-catalog-content\") pod \"certified-operators-lvm2t\" (UID: \"95a76d39-4dd7-411f-9c40-dc445a9cf4db\") " pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:42 crc kubenswrapper[4835]: I1003 18:59:42.128636 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbcvt\" (UniqueName: \"kubernetes.io/projected/95a76d39-4dd7-411f-9c40-dc445a9cf4db-kube-api-access-sbcvt\") pod \"certified-operators-lvm2t\" (UID: \"95a76d39-4dd7-411f-9c40-dc445a9cf4db\") " pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:42 crc kubenswrapper[4835]: I1003 18:59:42.128749 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95a76d39-4dd7-411f-9c40-dc445a9cf4db-utilities\") pod \"certified-operators-lvm2t\" (UID: \"95a76d39-4dd7-411f-9c40-dc445a9cf4db\") " pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:42 crc kubenswrapper[4835]: I1003 18:59:42.128848 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95a76d39-4dd7-411f-9c40-dc445a9cf4db-catalog-content\") pod \"certified-operators-lvm2t\" (UID: \"95a76d39-4dd7-411f-9c40-dc445a9cf4db\") " pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:42 crc kubenswrapper[4835]: I1003 18:59:42.129372 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95a76d39-4dd7-411f-9c40-dc445a9cf4db-catalog-content\") pod \"certified-operators-lvm2t\" (UID: \"95a76d39-4dd7-411f-9c40-dc445a9cf4db\") " pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:42 crc kubenswrapper[4835]: I1003 18:59:42.129499 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95a76d39-4dd7-411f-9c40-dc445a9cf4db-utilities\") pod \"certified-operators-lvm2t\" (UID: \"95a76d39-4dd7-411f-9c40-dc445a9cf4db\") " pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:42 crc kubenswrapper[4835]: I1003 18:59:42.157205 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sbcvt\" (UniqueName: \"kubernetes.io/projected/95a76d39-4dd7-411f-9c40-dc445a9cf4db-kube-api-access-sbcvt\") pod \"certified-operators-lvm2t\" (UID: \"95a76d39-4dd7-411f-9c40-dc445a9cf4db\") " pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:42 crc kubenswrapper[4835]: I1003 18:59:42.220566 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:42 crc kubenswrapper[4835]: I1003 18:59:42.729960 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lvm2t"] Oct 03 18:59:43 crc kubenswrapper[4835]: I1003 18:59:43.051856 4835 generic.go:334] "Generic (PLEG): container finished" podID="95a76d39-4dd7-411f-9c40-dc445a9cf4db" containerID="971d6c05653d62852c19c602c9b93affcfda6ba6423c5476f151c10d488695dc" exitCode=0 Oct 03 18:59:43 crc kubenswrapper[4835]: I1003 18:59:43.053099 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvm2t" event={"ID":"95a76d39-4dd7-411f-9c40-dc445a9cf4db","Type":"ContainerDied","Data":"971d6c05653d62852c19c602c9b93affcfda6ba6423c5476f151c10d488695dc"} Oct 03 18:59:43 crc kubenswrapper[4835]: I1003 18:59:43.053138 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvm2t" event={"ID":"95a76d39-4dd7-411f-9c40-dc445a9cf4db","Type":"ContainerStarted","Data":"7b2962bb52c7ccc8298367cbb9459817d319b9449471ea88cdccd5f0e8b8ff82"} Oct 03 18:59:44 crc kubenswrapper[4835]: I1003 18:59:44.297202 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7blw6"] Oct 03 18:59:44 crc kubenswrapper[4835]: I1003 18:59:44.302403 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:44 crc kubenswrapper[4835]: I1003 18:59:44.315183 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7blw6"] Oct 03 18:59:44 crc kubenswrapper[4835]: I1003 18:59:44.379144 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-catalog-content\") pod \"community-operators-7blw6\" (UID: \"4f8518d3-7fd0-45d4-ac23-d063f79aacd8\") " pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:44 crc kubenswrapper[4835]: I1003 18:59:44.379199 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-utilities\") pod \"community-operators-7blw6\" (UID: \"4f8518d3-7fd0-45d4-ac23-d063f79aacd8\") " pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:44 crc kubenswrapper[4835]: I1003 18:59:44.379222 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lksks\" (UniqueName: \"kubernetes.io/projected/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-kube-api-access-lksks\") pod \"community-operators-7blw6\" (UID: \"4f8518d3-7fd0-45d4-ac23-d063f79aacd8\") " pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:44 crc kubenswrapper[4835]: I1003 18:59:44.480300 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-catalog-content\") pod \"community-operators-7blw6\" (UID: \"4f8518d3-7fd0-45d4-ac23-d063f79aacd8\") " pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:44 crc kubenswrapper[4835]: I1003 18:59:44.480349 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-utilities\") pod \"community-operators-7blw6\" (UID: \"4f8518d3-7fd0-45d4-ac23-d063f79aacd8\") " pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:44 crc kubenswrapper[4835]: I1003 18:59:44.480369 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lksks\" (UniqueName: \"kubernetes.io/projected/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-kube-api-access-lksks\") pod \"community-operators-7blw6\" (UID: \"4f8518d3-7fd0-45d4-ac23-d063f79aacd8\") " pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:44 crc kubenswrapper[4835]: I1003 18:59:44.481147 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-catalog-content\") pod \"community-operators-7blw6\" (UID: \"4f8518d3-7fd0-45d4-ac23-d063f79aacd8\") " pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:44 crc kubenswrapper[4835]: I1003 18:59:44.481251 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-utilities\") pod \"community-operators-7blw6\" (UID: \"4f8518d3-7fd0-45d4-ac23-d063f79aacd8\") " pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:44 crc kubenswrapper[4835]: I1003 18:59:44.504487 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lksks\" (UniqueName: \"kubernetes.io/projected/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-kube-api-access-lksks\") pod \"community-operators-7blw6\" (UID: \"4f8518d3-7fd0-45d4-ac23-d063f79aacd8\") " pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:44 crc kubenswrapper[4835]: I1003 18:59:44.631787 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:45 crc kubenswrapper[4835]: I1003 18:59:45.076496 4835 generic.go:334] "Generic (PLEG): container finished" podID="95a76d39-4dd7-411f-9c40-dc445a9cf4db" containerID="b00640ac251577c12d8f5c4207b33385617cc4d27eff1ac37416f168414518d4" exitCode=0 Oct 03 18:59:45 crc kubenswrapper[4835]: I1003 18:59:45.076556 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvm2t" event={"ID":"95a76d39-4dd7-411f-9c40-dc445a9cf4db","Type":"ContainerDied","Data":"b00640ac251577c12d8f5c4207b33385617cc4d27eff1ac37416f168414518d4"} Oct 03 18:59:45 crc kubenswrapper[4835]: I1003 18:59:45.198992 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7blw6"] Oct 03 18:59:45 crc kubenswrapper[4835]: W1003 18:59:45.200338 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f8518d3_7fd0_45d4_ac23_d063f79aacd8.slice/crio-5783a7cf50afb809a9d6f3677132dd5ac1a318899070c19924e97811c056012f WatchSource:0}: Error finding container 5783a7cf50afb809a9d6f3677132dd5ac1a318899070c19924e97811c056012f: Status 404 returned error can't find the container with id 5783a7cf50afb809a9d6f3677132dd5ac1a318899070c19924e97811c056012f Oct 03 18:59:46 crc kubenswrapper[4835]: I1003 18:59:46.088121 4835 generic.go:334] "Generic (PLEG): container finished" podID="4f8518d3-7fd0-45d4-ac23-d063f79aacd8" containerID="08db49a5885b66fd4d468d77be3b5acb7cfdedbc41d137703b57674b8bb941fc" exitCode=0 Oct 03 18:59:46 crc kubenswrapper[4835]: I1003 18:59:46.088297 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7blw6" event={"ID":"4f8518d3-7fd0-45d4-ac23-d063f79aacd8","Type":"ContainerDied","Data":"08db49a5885b66fd4d468d77be3b5acb7cfdedbc41d137703b57674b8bb941fc"} Oct 03 18:59:46 crc kubenswrapper[4835]: I1003 18:59:46.088692 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7blw6" event={"ID":"4f8518d3-7fd0-45d4-ac23-d063f79aacd8","Type":"ContainerStarted","Data":"5783a7cf50afb809a9d6f3677132dd5ac1a318899070c19924e97811c056012f"} Oct 03 18:59:46 crc kubenswrapper[4835]: I1003 18:59:46.091928 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvm2t" event={"ID":"95a76d39-4dd7-411f-9c40-dc445a9cf4db","Type":"ContainerStarted","Data":"783ddbb3f9f39bdfe0674a14a002ceb5c9aec59251f35bc321a7d2b09930c372"} Oct 03 18:59:47 crc kubenswrapper[4835]: I1003 18:59:47.107049 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7blw6" event={"ID":"4f8518d3-7fd0-45d4-ac23-d063f79aacd8","Type":"ContainerStarted","Data":"ccf14c097b0c918b65ab9e7eafa1ddf151b89ad1dbb08dffe9fa14703de16f95"} Oct 03 18:59:47 crc kubenswrapper[4835]: I1003 18:59:47.139463 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lvm2t" 
podStartSLOduration=3.650178634 podStartE2EDuration="6.139443422s" podCreationTimestamp="2025-10-03 18:59:41 +0000 UTC" firstStartedPulling="2025-10-03 18:59:43.053381253 +0000 UTC m=+2724.769322125" lastFinishedPulling="2025-10-03 18:59:45.542646041 +0000 UTC m=+2727.258586913" observedRunningTime="2025-10-03 18:59:46.13851682 +0000 UTC m=+2727.854457692" watchObservedRunningTime="2025-10-03 18:59:47.139443422 +0000 UTC m=+2728.855384294" Oct 03 18:59:48 crc kubenswrapper[4835]: I1003 18:59:48.117457 4835 generic.go:334] "Generic (PLEG): container finished" podID="4f8518d3-7fd0-45d4-ac23-d063f79aacd8" containerID="ccf14c097b0c918b65ab9e7eafa1ddf151b89ad1dbb08dffe9fa14703de16f95" exitCode=0 Oct 03 18:59:48 crc kubenswrapper[4835]: I1003 18:59:48.118171 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7blw6" event={"ID":"4f8518d3-7fd0-45d4-ac23-d063f79aacd8","Type":"ContainerDied","Data":"ccf14c097b0c918b65ab9e7eafa1ddf151b89ad1dbb08dffe9fa14703de16f95"} Oct 03 18:59:50 crc kubenswrapper[4835]: I1003 18:59:50.142134 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7blw6" event={"ID":"4f8518d3-7fd0-45d4-ac23-d063f79aacd8","Type":"ContainerStarted","Data":"1e066e32d7dca414590f96df703267515b13acc5c215f401b26fd6d9d082d7fd"} Oct 03 18:59:50 crc kubenswrapper[4835]: I1003 18:59:50.167402 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7blw6" podStartSLOduration=2.961477382 podStartE2EDuration="6.167383019s" podCreationTimestamp="2025-10-03 18:59:44 +0000 UTC" firstStartedPulling="2025-10-03 18:59:46.09110906 +0000 UTC m=+2727.807049922" lastFinishedPulling="2025-10-03 18:59:49.297014687 +0000 UTC m=+2731.012955559" observedRunningTime="2025-10-03 18:59:50.157908635 +0000 UTC m=+2731.873849507" watchObservedRunningTime="2025-10-03 18:59:50.167383019 +0000 UTC m=+2731.883323891" Oct 03 18:59:52 crc kubenswrapper[4835]: I1003 18:59:52.221715 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:52 crc kubenswrapper[4835]: I1003 18:59:52.223387 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:52 crc kubenswrapper[4835]: I1003 18:59:52.265919 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:53 crc kubenswrapper[4835]: I1003 18:59:53.215863 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:53 crc kubenswrapper[4835]: I1003 18:59:53.283740 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lvm2t"] Oct 03 18:59:54 crc kubenswrapper[4835]: I1003 18:59:54.633320 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:54 crc kubenswrapper[4835]: I1003 18:59:54.634527 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:54 crc kubenswrapper[4835]: I1003 18:59:54.680181 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:55 crc kubenswrapper[4835]: I1003 18:59:55.186206 4835 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lvm2t" podUID="95a76d39-4dd7-411f-9c40-dc445a9cf4db" containerName="registry-server" containerID="cri-o://783ddbb3f9f39bdfe0674a14a002ceb5c9aec59251f35bc321a7d2b09930c372" gracePeriod=2 Oct 03 18:59:55 crc kubenswrapper[4835]: I1003 18:59:55.233727 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:55 crc kubenswrapper[4835]: I1003 18:59:55.652510 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:55 crc kubenswrapper[4835]: I1003 18:59:55.698227 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7blw6"] Oct 03 18:59:55 crc kubenswrapper[4835]: I1003 18:59:55.744205 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95a76d39-4dd7-411f-9c40-dc445a9cf4db-catalog-content\") pod \"95a76d39-4dd7-411f-9c40-dc445a9cf4db\" (UID: \"95a76d39-4dd7-411f-9c40-dc445a9cf4db\") " Oct 03 18:59:55 crc kubenswrapper[4835]: I1003 18:59:55.744322 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95a76d39-4dd7-411f-9c40-dc445a9cf4db-utilities\") pod \"95a76d39-4dd7-411f-9c40-dc445a9cf4db\" (UID: \"95a76d39-4dd7-411f-9c40-dc445a9cf4db\") " Oct 03 18:59:55 crc kubenswrapper[4835]: I1003 18:59:55.744386 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbcvt\" (UniqueName: \"kubernetes.io/projected/95a76d39-4dd7-411f-9c40-dc445a9cf4db-kube-api-access-sbcvt\") pod \"95a76d39-4dd7-411f-9c40-dc445a9cf4db\" (UID: \"95a76d39-4dd7-411f-9c40-dc445a9cf4db\") " Oct 03 18:59:55 crc kubenswrapper[4835]: I1003 18:59:55.745291 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95a76d39-4dd7-411f-9c40-dc445a9cf4db-utilities" (OuterVolumeSpecName: "utilities") pod "95a76d39-4dd7-411f-9c40-dc445a9cf4db" (UID: "95a76d39-4dd7-411f-9c40-dc445a9cf4db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:59:55 crc kubenswrapper[4835]: I1003 18:59:55.754248 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a76d39-4dd7-411f-9c40-dc445a9cf4db-kube-api-access-sbcvt" (OuterVolumeSpecName: "kube-api-access-sbcvt") pod "95a76d39-4dd7-411f-9c40-dc445a9cf4db" (UID: "95a76d39-4dd7-411f-9c40-dc445a9cf4db"). InnerVolumeSpecName "kube-api-access-sbcvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:59:55 crc kubenswrapper[4835]: I1003 18:59:55.846222 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95a76d39-4dd7-411f-9c40-dc445a9cf4db-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:55 crc kubenswrapper[4835]: I1003 18:59:55.846253 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbcvt\" (UniqueName: \"kubernetes.io/projected/95a76d39-4dd7-411f-9c40-dc445a9cf4db-kube-api-access-sbcvt\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.065052 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95a76d39-4dd7-411f-9c40-dc445a9cf4db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95a76d39-4dd7-411f-9c40-dc445a9cf4db" (UID: "95a76d39-4dd7-411f-9c40-dc445a9cf4db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.152360 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95a76d39-4dd7-411f-9c40-dc445a9cf4db-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.204818 4835 generic.go:334] "Generic (PLEG): container finished" podID="95a76d39-4dd7-411f-9c40-dc445a9cf4db" containerID="783ddbb3f9f39bdfe0674a14a002ceb5c9aec59251f35bc321a7d2b09930c372" exitCode=0 Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.204935 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lvm2t" Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.204924 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvm2t" event={"ID":"95a76d39-4dd7-411f-9c40-dc445a9cf4db","Type":"ContainerDied","Data":"783ddbb3f9f39bdfe0674a14a002ceb5c9aec59251f35bc321a7d2b09930c372"} Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.205002 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvm2t" event={"ID":"95a76d39-4dd7-411f-9c40-dc445a9cf4db","Type":"ContainerDied","Data":"7b2962bb52c7ccc8298367cbb9459817d319b9449471ea88cdccd5f0e8b8ff82"} Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.205028 4835 scope.go:117] "RemoveContainer" containerID="783ddbb3f9f39bdfe0674a14a002ceb5c9aec59251f35bc321a7d2b09930c372" Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.234661 4835 scope.go:117] "RemoveContainer" containerID="b00640ac251577c12d8f5c4207b33385617cc4d27eff1ac37416f168414518d4" Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.240428 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lvm2t"] Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.248994 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lvm2t"] Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.265459 4835 scope.go:117] "RemoveContainer" containerID="971d6c05653d62852c19c602c9b93affcfda6ba6423c5476f151c10d488695dc" Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.306642 4835 scope.go:117] "RemoveContainer" containerID="783ddbb3f9f39bdfe0674a14a002ceb5c9aec59251f35bc321a7d2b09930c372" Oct 03 18:59:56 crc kubenswrapper[4835]: E1003 18:59:56.307006 4835 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"783ddbb3f9f39bdfe0674a14a002ceb5c9aec59251f35bc321a7d2b09930c372\": container with ID starting with 783ddbb3f9f39bdfe0674a14a002ceb5c9aec59251f35bc321a7d2b09930c372 not found: ID does not exist" containerID="783ddbb3f9f39bdfe0674a14a002ceb5c9aec59251f35bc321a7d2b09930c372" Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.307037 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"783ddbb3f9f39bdfe0674a14a002ceb5c9aec59251f35bc321a7d2b09930c372"} err="failed to get container status \"783ddbb3f9f39bdfe0674a14a002ceb5c9aec59251f35bc321a7d2b09930c372\": rpc error: code = NotFound desc = could not find container \"783ddbb3f9f39bdfe0674a14a002ceb5c9aec59251f35bc321a7d2b09930c372\": container with ID starting with 783ddbb3f9f39bdfe0674a14a002ceb5c9aec59251f35bc321a7d2b09930c372 not found: ID does not exist" Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.307057 4835 scope.go:117] "RemoveContainer" containerID="b00640ac251577c12d8f5c4207b33385617cc4d27eff1ac37416f168414518d4" Oct 03 18:59:56 crc kubenswrapper[4835]: E1003 18:59:56.307459 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b00640ac251577c12d8f5c4207b33385617cc4d27eff1ac37416f168414518d4\": container with ID starting with b00640ac251577c12d8f5c4207b33385617cc4d27eff1ac37416f168414518d4 not found: ID does not exist" containerID="b00640ac251577c12d8f5c4207b33385617cc4d27eff1ac37416f168414518d4" Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.307501 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b00640ac251577c12d8f5c4207b33385617cc4d27eff1ac37416f168414518d4"} err="failed to get container status \"b00640ac251577c12d8f5c4207b33385617cc4d27eff1ac37416f168414518d4\": rpc error: code = NotFound desc = could not find container \"b00640ac251577c12d8f5c4207b33385617cc4d27eff1ac37416f168414518d4\": container with ID starting with b00640ac251577c12d8f5c4207b33385617cc4d27eff1ac37416f168414518d4 not found: ID does not exist" Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.307526 4835 scope.go:117] "RemoveContainer" containerID="971d6c05653d62852c19c602c9b93affcfda6ba6423c5476f151c10d488695dc" Oct 03 18:59:56 crc kubenswrapper[4835]: E1003 18:59:56.307791 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"971d6c05653d62852c19c602c9b93affcfda6ba6423c5476f151c10d488695dc\": container with ID starting with 971d6c05653d62852c19c602c9b93affcfda6ba6423c5476f151c10d488695dc not found: ID does not exist" containerID="971d6c05653d62852c19c602c9b93affcfda6ba6423c5476f151c10d488695dc" Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.307817 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"971d6c05653d62852c19c602c9b93affcfda6ba6423c5476f151c10d488695dc"} err="failed to get container status \"971d6c05653d62852c19c602c9b93affcfda6ba6423c5476f151c10d488695dc\": rpc error: code = NotFound desc = could not find container \"971d6c05653d62852c19c602c9b93affcfda6ba6423c5476f151c10d488695dc\": container with ID starting with 971d6c05653d62852c19c602c9b93affcfda6ba6423c5476f151c10d488695dc not found: ID does not exist" Oct 03 18:59:56 crc kubenswrapper[4835]: I1003 18:59:56.889607 4835 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="95a76d39-4dd7-411f-9c40-dc445a9cf4db" path="/var/lib/kubelet/pods/95a76d39-4dd7-411f-9c40-dc445a9cf4db/volumes" Oct 03 18:59:57 crc kubenswrapper[4835]: I1003 18:59:57.215878 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7blw6" podUID="4f8518d3-7fd0-45d4-ac23-d063f79aacd8" containerName="registry-server" containerID="cri-o://1e066e32d7dca414590f96df703267515b13acc5c215f401b26fd6d9d082d7fd" gracePeriod=2 Oct 03 18:59:57 crc kubenswrapper[4835]: I1003 18:59:57.676588 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:57 crc kubenswrapper[4835]: I1003 18:59:57.780900 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-catalog-content\") pod \"4f8518d3-7fd0-45d4-ac23-d063f79aacd8\" (UID: \"4f8518d3-7fd0-45d4-ac23-d063f79aacd8\") " Oct 03 18:59:57 crc kubenswrapper[4835]: I1003 18:59:57.780966 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lksks\" (UniqueName: \"kubernetes.io/projected/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-kube-api-access-lksks\") pod \"4f8518d3-7fd0-45d4-ac23-d063f79aacd8\" (UID: \"4f8518d3-7fd0-45d4-ac23-d063f79aacd8\") " Oct 03 18:59:57 crc kubenswrapper[4835]: I1003 18:59:57.781062 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-utilities\") pod \"4f8518d3-7fd0-45d4-ac23-d063f79aacd8\" (UID: \"4f8518d3-7fd0-45d4-ac23-d063f79aacd8\") " Oct 03 18:59:57 crc kubenswrapper[4835]: I1003 18:59:57.782085 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-utilities" (OuterVolumeSpecName: "utilities") pod "4f8518d3-7fd0-45d4-ac23-d063f79aacd8" (UID: "4f8518d3-7fd0-45d4-ac23-d063f79aacd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:59:57 crc kubenswrapper[4835]: I1003 18:59:57.787053 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-kube-api-access-lksks" (OuterVolumeSpecName: "kube-api-access-lksks") pod "4f8518d3-7fd0-45d4-ac23-d063f79aacd8" (UID: "4f8518d3-7fd0-45d4-ac23-d063f79aacd8"). InnerVolumeSpecName "kube-api-access-lksks". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 18:59:57 crc kubenswrapper[4835]: I1003 18:59:57.835904 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f8518d3-7fd0-45d4-ac23-d063f79aacd8" (UID: "4f8518d3-7fd0-45d4-ac23-d063f79aacd8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 18:59:57 crc kubenswrapper[4835]: I1003 18:59:57.883579 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:57 crc kubenswrapper[4835]: I1003 18:59:57.883614 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:57 crc kubenswrapper[4835]: I1003 18:59:57.883630 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lksks\" (UniqueName: \"kubernetes.io/projected/4f8518d3-7fd0-45d4-ac23-d063f79aacd8-kube-api-access-lksks\") on node \"crc\" DevicePath \"\"" Oct 03 18:59:58 crc kubenswrapper[4835]: I1003 18:59:58.226584 4835 generic.go:334] "Generic (PLEG): container finished" podID="4f8518d3-7fd0-45d4-ac23-d063f79aacd8" containerID="1e066e32d7dca414590f96df703267515b13acc5c215f401b26fd6d9d082d7fd" exitCode=0 Oct 03 18:59:58 crc kubenswrapper[4835]: I1003 18:59:58.226644 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7blw6" event={"ID":"4f8518d3-7fd0-45d4-ac23-d063f79aacd8","Type":"ContainerDied","Data":"1e066e32d7dca414590f96df703267515b13acc5c215f401b26fd6d9d082d7fd"} Oct 03 18:59:58 crc kubenswrapper[4835]: I1003 18:59:58.226669 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7blw6" Oct 03 18:59:58 crc kubenswrapper[4835]: I1003 18:59:58.226705 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7blw6" event={"ID":"4f8518d3-7fd0-45d4-ac23-d063f79aacd8","Type":"ContainerDied","Data":"5783a7cf50afb809a9d6f3677132dd5ac1a318899070c19924e97811c056012f"} Oct 03 18:59:58 crc kubenswrapper[4835]: I1003 18:59:58.226730 4835 scope.go:117] "RemoveContainer" containerID="1e066e32d7dca414590f96df703267515b13acc5c215f401b26fd6d9d082d7fd" Oct 03 18:59:58 crc kubenswrapper[4835]: I1003 18:59:58.245881 4835 scope.go:117] "RemoveContainer" containerID="ccf14c097b0c918b65ab9e7eafa1ddf151b89ad1dbb08dffe9fa14703de16f95" Oct 03 18:59:58 crc kubenswrapper[4835]: I1003 18:59:58.261004 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7blw6"] Oct 03 18:59:58 crc kubenswrapper[4835]: I1003 18:59:58.268644 4835 scope.go:117] "RemoveContainer" containerID="08db49a5885b66fd4d468d77be3b5acb7cfdedbc41d137703b57674b8bb941fc" Oct 03 18:59:58 crc kubenswrapper[4835]: I1003 18:59:58.272230 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7blw6"] Oct 03 18:59:58 crc kubenswrapper[4835]: I1003 18:59:58.319241 4835 scope.go:117] "RemoveContainer" containerID="1e066e32d7dca414590f96df703267515b13acc5c215f401b26fd6d9d082d7fd" Oct 03 18:59:58 crc kubenswrapper[4835]: E1003 18:59:58.319877 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e066e32d7dca414590f96df703267515b13acc5c215f401b26fd6d9d082d7fd\": container with ID starting with 1e066e32d7dca414590f96df703267515b13acc5c215f401b26fd6d9d082d7fd not found: ID does not exist" containerID="1e066e32d7dca414590f96df703267515b13acc5c215f401b26fd6d9d082d7fd" Oct 03 18:59:58 crc kubenswrapper[4835]: I1003 18:59:58.319960 
4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e066e32d7dca414590f96df703267515b13acc5c215f401b26fd6d9d082d7fd"} err="failed to get container status \"1e066e32d7dca414590f96df703267515b13acc5c215f401b26fd6d9d082d7fd\": rpc error: code = NotFound desc = could not find container \"1e066e32d7dca414590f96df703267515b13acc5c215f401b26fd6d9d082d7fd\": container with ID starting with 1e066e32d7dca414590f96df703267515b13acc5c215f401b26fd6d9d082d7fd not found: ID does not exist" Oct 03 18:59:58 crc kubenswrapper[4835]: I1003 18:59:58.320003 4835 scope.go:117] "RemoveContainer" containerID="ccf14c097b0c918b65ab9e7eafa1ddf151b89ad1dbb08dffe9fa14703de16f95" Oct 03 18:59:58 crc kubenswrapper[4835]: E1003 18:59:58.320644 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccf14c097b0c918b65ab9e7eafa1ddf151b89ad1dbb08dffe9fa14703de16f95\": container with ID starting with ccf14c097b0c918b65ab9e7eafa1ddf151b89ad1dbb08dffe9fa14703de16f95 not found: ID does not exist" containerID="ccf14c097b0c918b65ab9e7eafa1ddf151b89ad1dbb08dffe9fa14703de16f95" Oct 03 18:59:58 crc kubenswrapper[4835]: I1003 18:59:58.320703 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccf14c097b0c918b65ab9e7eafa1ddf151b89ad1dbb08dffe9fa14703de16f95"} err="failed to get container status \"ccf14c097b0c918b65ab9e7eafa1ddf151b89ad1dbb08dffe9fa14703de16f95\": rpc error: code = NotFound desc = could not find container \"ccf14c097b0c918b65ab9e7eafa1ddf151b89ad1dbb08dffe9fa14703de16f95\": container with ID starting with ccf14c097b0c918b65ab9e7eafa1ddf151b89ad1dbb08dffe9fa14703de16f95 not found: ID does not exist" Oct 03 18:59:58 crc kubenswrapper[4835]: I1003 18:59:58.320743 4835 scope.go:117] "RemoveContainer" containerID="08db49a5885b66fd4d468d77be3b5acb7cfdedbc41d137703b57674b8bb941fc" Oct 03 18:59:58 crc kubenswrapper[4835]: E1003 18:59:58.321190 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08db49a5885b66fd4d468d77be3b5acb7cfdedbc41d137703b57674b8bb941fc\": container with ID starting with 08db49a5885b66fd4d468d77be3b5acb7cfdedbc41d137703b57674b8bb941fc not found: ID does not exist" containerID="08db49a5885b66fd4d468d77be3b5acb7cfdedbc41d137703b57674b8bb941fc" Oct 03 18:59:58 crc kubenswrapper[4835]: I1003 18:59:58.321222 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08db49a5885b66fd4d468d77be3b5acb7cfdedbc41d137703b57674b8bb941fc"} err="failed to get container status \"08db49a5885b66fd4d468d77be3b5acb7cfdedbc41d137703b57674b8bb941fc\": rpc error: code = NotFound desc = could not find container \"08db49a5885b66fd4d468d77be3b5acb7cfdedbc41d137703b57674b8bb941fc\": container with ID starting with 08db49a5885b66fd4d468d77be3b5acb7cfdedbc41d137703b57674b8bb941fc not found: ID does not exist" Oct 03 18:59:58 crc kubenswrapper[4835]: I1003 18:59:58.887999 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f8518d3-7fd0-45d4-ac23-d063f79aacd8" path="/var/lib/kubelet/pods/4f8518d3-7fd0-45d4-ac23-d063f79aacd8/volumes" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.149278 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw"] Oct 03 19:00:00 crc kubenswrapper[4835]: E1003 19:00:00.167054 4835 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="4f8518d3-7fd0-45d4-ac23-d063f79aacd8" containerName="extract-content" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.167408 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8518d3-7fd0-45d4-ac23-d063f79aacd8" containerName="extract-content" Oct 03 19:00:00 crc kubenswrapper[4835]: E1003 19:00:00.167523 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a76d39-4dd7-411f-9c40-dc445a9cf4db" containerName="registry-server" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.167600 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a76d39-4dd7-411f-9c40-dc445a9cf4db" containerName="registry-server" Oct 03 19:00:00 crc kubenswrapper[4835]: E1003 19:00:00.167762 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a76d39-4dd7-411f-9c40-dc445a9cf4db" containerName="extract-utilities" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.167835 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a76d39-4dd7-411f-9c40-dc445a9cf4db" containerName="extract-utilities" Oct 03 19:00:00 crc kubenswrapper[4835]: E1003 19:00:00.167928 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8518d3-7fd0-45d4-ac23-d063f79aacd8" containerName="registry-server" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.167996 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8518d3-7fd0-45d4-ac23-d063f79aacd8" containerName="registry-server" Oct 03 19:00:00 crc kubenswrapper[4835]: E1003 19:00:00.168148 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8518d3-7fd0-45d4-ac23-d063f79aacd8" containerName="extract-utilities" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.168237 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8518d3-7fd0-45d4-ac23-d063f79aacd8" containerName="extract-utilities" Oct 03 19:00:00 crc kubenswrapper[4835]: E1003 19:00:00.168321 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a76d39-4dd7-411f-9c40-dc445a9cf4db" containerName="extract-content" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.168391 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a76d39-4dd7-411f-9c40-dc445a9cf4db" containerName="extract-content" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.169174 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f8518d3-7fd0-45d4-ac23-d063f79aacd8" containerName="registry-server" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.169300 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="95a76d39-4dd7-411f-9c40-dc445a9cf4db" containerName="registry-server" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.173597 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.177940 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.179726 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw"] Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.179581 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.232479 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1519990-5da3-4aa7-84d9-248acce94038-config-volume\") pod \"collect-profiles-29325300-xqclw\" (UID: \"d1519990-5da3-4aa7-84d9-248acce94038\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.232696 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqd26\" (UniqueName: \"kubernetes.io/projected/d1519990-5da3-4aa7-84d9-248acce94038-kube-api-access-jqd26\") pod \"collect-profiles-29325300-xqclw\" (UID: \"d1519990-5da3-4aa7-84d9-248acce94038\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.232747 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1519990-5da3-4aa7-84d9-248acce94038-secret-volume\") pod \"collect-profiles-29325300-xqclw\" (UID: \"d1519990-5da3-4aa7-84d9-248acce94038\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.334881 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1519990-5da3-4aa7-84d9-248acce94038-config-volume\") pod \"collect-profiles-29325300-xqclw\" (UID: \"d1519990-5da3-4aa7-84d9-248acce94038\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.335212 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqd26\" (UniqueName: \"kubernetes.io/projected/d1519990-5da3-4aa7-84d9-248acce94038-kube-api-access-jqd26\") pod \"collect-profiles-29325300-xqclw\" (UID: \"d1519990-5da3-4aa7-84d9-248acce94038\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.335316 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1519990-5da3-4aa7-84d9-248acce94038-secret-volume\") pod \"collect-profiles-29325300-xqclw\" (UID: \"d1519990-5da3-4aa7-84d9-248acce94038\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.335969 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1519990-5da3-4aa7-84d9-248acce94038-config-volume\") pod 
\"collect-profiles-29325300-xqclw\" (UID: \"d1519990-5da3-4aa7-84d9-248acce94038\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.341596 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1519990-5da3-4aa7-84d9-248acce94038-secret-volume\") pod \"collect-profiles-29325300-xqclw\" (UID: \"d1519990-5da3-4aa7-84d9-248acce94038\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.356743 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqd26\" (UniqueName: \"kubernetes.io/projected/d1519990-5da3-4aa7-84d9-248acce94038-kube-api-access-jqd26\") pod \"collect-profiles-29325300-xqclw\" (UID: \"d1519990-5da3-4aa7-84d9-248acce94038\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.514708 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" Oct 03 19:00:00 crc kubenswrapper[4835]: I1003 19:00:00.973729 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw"] Oct 03 19:00:01 crc kubenswrapper[4835]: I1003 19:00:01.295683 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" event={"ID":"d1519990-5da3-4aa7-84d9-248acce94038","Type":"ContainerStarted","Data":"b562636137c5f39f7a7dcb7187f2f699d6e6f77f353715a278f1b0e0da3f92e4"} Oct 03 19:00:01 crc kubenswrapper[4835]: I1003 19:00:01.295732 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" event={"ID":"d1519990-5da3-4aa7-84d9-248acce94038","Type":"ContainerStarted","Data":"0784fa67eea329765de84e537a079311b456606f6dbfec107114e259123f1230"} Oct 03 19:00:01 crc kubenswrapper[4835]: I1003 19:00:01.315100 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" podStartSLOduration=1.315063991 podStartE2EDuration="1.315063991s" podCreationTimestamp="2025-10-03 19:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 19:00:01.312865237 +0000 UTC m=+2743.028806129" watchObservedRunningTime="2025-10-03 19:00:01.315063991 +0000 UTC m=+2743.031004863" Oct 03 19:00:02 crc kubenswrapper[4835]: I1003 19:00:02.310440 4835 generic.go:334] "Generic (PLEG): container finished" podID="d1519990-5da3-4aa7-84d9-248acce94038" containerID="b562636137c5f39f7a7dcb7187f2f699d6e6f77f353715a278f1b0e0da3f92e4" exitCode=0 Oct 03 19:00:02 crc kubenswrapper[4835]: I1003 19:00:02.310512 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" event={"ID":"d1519990-5da3-4aa7-84d9-248acce94038","Type":"ContainerDied","Data":"b562636137c5f39f7a7dcb7187f2f699d6e6f77f353715a278f1b0e0da3f92e4"} Oct 03 19:00:03 crc kubenswrapper[4835]: I1003 19:00:03.669304 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" Oct 03 19:00:03 crc kubenswrapper[4835]: I1003 19:00:03.802706 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1519990-5da3-4aa7-84d9-248acce94038-secret-volume\") pod \"d1519990-5da3-4aa7-84d9-248acce94038\" (UID: \"d1519990-5da3-4aa7-84d9-248acce94038\") " Oct 03 19:00:03 crc kubenswrapper[4835]: I1003 19:00:03.802837 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqd26\" (UniqueName: \"kubernetes.io/projected/d1519990-5da3-4aa7-84d9-248acce94038-kube-api-access-jqd26\") pod \"d1519990-5da3-4aa7-84d9-248acce94038\" (UID: \"d1519990-5da3-4aa7-84d9-248acce94038\") " Oct 03 19:00:03 crc kubenswrapper[4835]: I1003 19:00:03.802968 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1519990-5da3-4aa7-84d9-248acce94038-config-volume\") pod \"d1519990-5da3-4aa7-84d9-248acce94038\" (UID: \"d1519990-5da3-4aa7-84d9-248acce94038\") " Oct 03 19:00:03 crc kubenswrapper[4835]: I1003 19:00:03.804725 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1519990-5da3-4aa7-84d9-248acce94038-config-volume" (OuterVolumeSpecName: "config-volume") pod "d1519990-5da3-4aa7-84d9-248acce94038" (UID: "d1519990-5da3-4aa7-84d9-248acce94038"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 19:00:03 crc kubenswrapper[4835]: I1003 19:00:03.809825 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1519990-5da3-4aa7-84d9-248acce94038-kube-api-access-jqd26" (OuterVolumeSpecName: "kube-api-access-jqd26") pod "d1519990-5da3-4aa7-84d9-248acce94038" (UID: "d1519990-5da3-4aa7-84d9-248acce94038"). InnerVolumeSpecName "kube-api-access-jqd26". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:00:03 crc kubenswrapper[4835]: I1003 19:00:03.810150 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1519990-5da3-4aa7-84d9-248acce94038-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d1519990-5da3-4aa7-84d9-248acce94038" (UID: "d1519990-5da3-4aa7-84d9-248acce94038"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:00:03 crc kubenswrapper[4835]: I1003 19:00:03.906388 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1519990-5da3-4aa7-84d9-248acce94038-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 19:00:03 crc kubenswrapper[4835]: I1003 19:00:03.906443 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqd26\" (UniqueName: \"kubernetes.io/projected/d1519990-5da3-4aa7-84d9-248acce94038-kube-api-access-jqd26\") on node \"crc\" DevicePath \"\"" Oct 03 19:00:03 crc kubenswrapper[4835]: I1003 19:00:03.906456 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1519990-5da3-4aa7-84d9-248acce94038-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 19:00:04 crc kubenswrapper[4835]: I1003 19:00:04.328854 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" event={"ID":"d1519990-5da3-4aa7-84d9-248acce94038","Type":"ContainerDied","Data":"0784fa67eea329765de84e537a079311b456606f6dbfec107114e259123f1230"} Oct 03 19:00:04 crc kubenswrapper[4835]: I1003 19:00:04.328917 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0784fa67eea329765de84e537a079311b456606f6dbfec107114e259123f1230" Oct 03 19:00:04 crc kubenswrapper[4835]: I1003 19:00:04.329008 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw" Oct 03 19:00:04 crc kubenswrapper[4835]: I1003 19:00:04.397420 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2"] Oct 03 19:00:04 crc kubenswrapper[4835]: I1003 19:00:04.405305 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325255-8n5s2"] Oct 03 19:00:04 crc kubenswrapper[4835]: I1003 19:00:04.887973 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cb17f76-674e-4cf7-8f87-9af6942bc5c3" path="/var/lib/kubelet/pods/8cb17f76-674e-4cf7-8f87-9af6942bc5c3/volumes" Oct 03 19:00:05 crc kubenswrapper[4835]: I1003 19:00:05.358481 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:00:05 crc kubenswrapper[4835]: I1003 19:00:05.358554 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:00:05 crc kubenswrapper[4835]: I1003 19:00:05.358619 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 19:00:05 crc kubenswrapper[4835]: I1003 19:00:05.359508 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9398b6fcb039b8d8f5e49cb35136b0dc88780d3c5fd88ccc393e7775caa98125"} 
pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 19:00:05 crc kubenswrapper[4835]: I1003 19:00:05.359581 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://9398b6fcb039b8d8f5e49cb35136b0dc88780d3c5fd88ccc393e7775caa98125" gracePeriod=600 Oct 03 19:00:06 crc kubenswrapper[4835]: I1003 19:00:06.347591 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="9398b6fcb039b8d8f5e49cb35136b0dc88780d3c5fd88ccc393e7775caa98125" exitCode=0 Oct 03 19:00:06 crc kubenswrapper[4835]: I1003 19:00:06.347676 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"9398b6fcb039b8d8f5e49cb35136b0dc88780d3c5fd88ccc393e7775caa98125"} Oct 03 19:00:06 crc kubenswrapper[4835]: I1003 19:00:06.348230 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f"} Oct 03 19:00:06 crc kubenswrapper[4835]: I1003 19:00:06.348255 4835 scope.go:117] "RemoveContainer" containerID="f67d17a5032473a0d64029ab037546530fbebe826edddbb4cb3f98c7bad060bc" Oct 03 19:00:43 crc kubenswrapper[4835]: I1003 19:00:43.350197 4835 scope.go:117] "RemoveContainer" containerID="943fb918381beb17520dd52bec8c1647113e869e66f2c357d45710ecbb9b300f" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.143825 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29325301-bb85k"] Oct 03 19:01:00 crc kubenswrapper[4835]: E1003 19:01:00.144886 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1519990-5da3-4aa7-84d9-248acce94038" containerName="collect-profiles" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.144905 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1519990-5da3-4aa7-84d9-248acce94038" containerName="collect-profiles" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.145147 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1519990-5da3-4aa7-84d9-248acce94038" containerName="collect-profiles" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.145842 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29325301-bb85k" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.178828 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29325301-bb85k"] Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.235468 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqlvd\" (UniqueName: \"kubernetes.io/projected/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-kube-api-access-tqlvd\") pod \"keystone-cron-29325301-bb85k\" (UID: \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\") " pod="openstack/keystone-cron-29325301-bb85k" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.235747 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-fernet-keys\") pod \"keystone-cron-29325301-bb85k\" (UID: \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\") " pod="openstack/keystone-cron-29325301-bb85k" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.235819 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-combined-ca-bundle\") pod \"keystone-cron-29325301-bb85k\" (UID: \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\") " pod="openstack/keystone-cron-29325301-bb85k" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.235867 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-config-data\") pod \"keystone-cron-29325301-bb85k\" (UID: \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\") " pod="openstack/keystone-cron-29325301-bb85k" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.337488 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqlvd\" (UniqueName: \"kubernetes.io/projected/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-kube-api-access-tqlvd\") pod \"keystone-cron-29325301-bb85k\" (UID: \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\") " pod="openstack/keystone-cron-29325301-bb85k" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.337616 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-fernet-keys\") pod \"keystone-cron-29325301-bb85k\" (UID: \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\") " pod="openstack/keystone-cron-29325301-bb85k" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.337651 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-combined-ca-bundle\") pod \"keystone-cron-29325301-bb85k\" (UID: \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\") " pod="openstack/keystone-cron-29325301-bb85k" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.337703 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-config-data\") pod \"keystone-cron-29325301-bb85k\" (UID: \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\") " pod="openstack/keystone-cron-29325301-bb85k" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.344823 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-config-data\") pod \"keystone-cron-29325301-bb85k\" (UID: \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\") " pod="openstack/keystone-cron-29325301-bb85k" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.344864 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-fernet-keys\") pod \"keystone-cron-29325301-bb85k\" (UID: \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\") " pod="openstack/keystone-cron-29325301-bb85k" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.351092 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-combined-ca-bundle\") pod \"keystone-cron-29325301-bb85k\" (UID: \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\") " pod="openstack/keystone-cron-29325301-bb85k" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.357985 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqlvd\" (UniqueName: \"kubernetes.io/projected/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-kube-api-access-tqlvd\") pod \"keystone-cron-29325301-bb85k\" (UID: \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\") " pod="openstack/keystone-cron-29325301-bb85k" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.475452 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29325301-bb85k" Oct 03 19:01:00 crc kubenswrapper[4835]: I1003 19:01:00.962862 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29325301-bb85k"] Oct 03 19:01:01 crc kubenswrapper[4835]: I1003 19:01:01.902697 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325301-bb85k" event={"ID":"99c21b19-0aed-4ab5-9d16-dcfa45e3236c","Type":"ContainerStarted","Data":"e5a9a4422930abe84e1196e6dc54eea2d94bcd58d49f0f3d1f3b1604eb16e40c"} Oct 03 19:01:01 crc kubenswrapper[4835]: I1003 19:01:01.903005 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325301-bb85k" event={"ID":"99c21b19-0aed-4ab5-9d16-dcfa45e3236c","Type":"ContainerStarted","Data":"7969ef1bb816d4c64d41c0df050c4390d4f905f24e42c6c524831a0cb2f64d96"} Oct 03 19:01:01 crc kubenswrapper[4835]: I1003 19:01:01.922027 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29325301-bb85k" podStartSLOduration=1.922005687 podStartE2EDuration="1.922005687s" podCreationTimestamp="2025-10-03 19:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 19:01:01.92051204 +0000 UTC m=+2803.636452922" watchObservedRunningTime="2025-10-03 19:01:01.922005687 +0000 UTC m=+2803.637946559" Oct 03 19:01:04 crc kubenswrapper[4835]: I1003 19:01:04.952321 4835 generic.go:334] "Generic (PLEG): container finished" podID="99c21b19-0aed-4ab5-9d16-dcfa45e3236c" containerID="e5a9a4422930abe84e1196e6dc54eea2d94bcd58d49f0f3d1f3b1604eb16e40c" exitCode=0 Oct 03 19:01:04 crc kubenswrapper[4835]: I1003 19:01:04.952489 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325301-bb85k" event={"ID":"99c21b19-0aed-4ab5-9d16-dcfa45e3236c","Type":"ContainerDied","Data":"e5a9a4422930abe84e1196e6dc54eea2d94bcd58d49f0f3d1f3b1604eb16e40c"} Oct 03 19:01:06 crc kubenswrapper[4835]: 
I1003 19:01:06.312403 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29325301-bb85k" Oct 03 19:01:06 crc kubenswrapper[4835]: I1003 19:01:06.355888 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-config-data\") pod \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\" (UID: \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\") " Oct 03 19:01:06 crc kubenswrapper[4835]: I1003 19:01:06.356028 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-combined-ca-bundle\") pod \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\" (UID: \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\") " Oct 03 19:01:06 crc kubenswrapper[4835]: I1003 19:01:06.356060 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-fernet-keys\") pod \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\" (UID: \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\") " Oct 03 19:01:06 crc kubenswrapper[4835]: I1003 19:01:06.356116 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqlvd\" (UniqueName: \"kubernetes.io/projected/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-kube-api-access-tqlvd\") pod \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\" (UID: \"99c21b19-0aed-4ab5-9d16-dcfa45e3236c\") " Oct 03 19:01:06 crc kubenswrapper[4835]: I1003 19:01:06.362255 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-kube-api-access-tqlvd" (OuterVolumeSpecName: "kube-api-access-tqlvd") pod "99c21b19-0aed-4ab5-9d16-dcfa45e3236c" (UID: "99c21b19-0aed-4ab5-9d16-dcfa45e3236c"). InnerVolumeSpecName "kube-api-access-tqlvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:01:06 crc kubenswrapper[4835]: I1003 19:01:06.373111 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "99c21b19-0aed-4ab5-9d16-dcfa45e3236c" (UID: "99c21b19-0aed-4ab5-9d16-dcfa45e3236c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:01:06 crc kubenswrapper[4835]: I1003 19:01:06.393868 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99c21b19-0aed-4ab5-9d16-dcfa45e3236c" (UID: "99c21b19-0aed-4ab5-9d16-dcfa45e3236c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:01:06 crc kubenswrapper[4835]: I1003 19:01:06.418212 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-config-data" (OuterVolumeSpecName: "config-data") pod "99c21b19-0aed-4ab5-9d16-dcfa45e3236c" (UID: "99c21b19-0aed-4ab5-9d16-dcfa45e3236c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:01:06 crc kubenswrapper[4835]: I1003 19:01:06.459018 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 19:01:06 crc kubenswrapper[4835]: I1003 19:01:06.459090 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 19:01:06 crc kubenswrapper[4835]: I1003 19:01:06.459107 4835 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 19:01:06 crc kubenswrapper[4835]: I1003 19:01:06.459118 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqlvd\" (UniqueName: \"kubernetes.io/projected/99c21b19-0aed-4ab5-9d16-dcfa45e3236c-kube-api-access-tqlvd\") on node \"crc\" DevicePath \"\"" Oct 03 19:01:06 crc kubenswrapper[4835]: I1003 19:01:06.970730 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29325301-bb85k" event={"ID":"99c21b19-0aed-4ab5-9d16-dcfa45e3236c","Type":"ContainerDied","Data":"7969ef1bb816d4c64d41c0df050c4390d4f905f24e42c6c524831a0cb2f64d96"} Oct 03 19:01:06 crc kubenswrapper[4835]: I1003 19:01:06.970784 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7969ef1bb816d4c64d41c0df050c4390d4f905f24e42c6c524831a0cb2f64d96" Oct 03 19:01:06 crc kubenswrapper[4835]: I1003 19:01:06.970795 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29325301-bb85k" Oct 03 19:01:40 crc kubenswrapper[4835]: I1003 19:01:40.275809 4835 generic.go:334] "Generic (PLEG): container finished" podID="f2406a66-d20f-4ac5-9817-a1bf1ff38c5d" containerID="6d5db5efc55a51cc643e54380fa0c9eea8861d282478139847b36f1996f1b77a" exitCode=0 Oct 03 19:01:40 crc kubenswrapper[4835]: I1003 19:01:40.275877 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" event={"ID":"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d","Type":"ContainerDied","Data":"6d5db5efc55a51cc643e54380fa0c9eea8861d282478139847b36f1996f1b77a"} Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.706452 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.760976 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ssh-key\") pod \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.761034 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-0\") pod \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.761135 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-1\") pod \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.761216 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7vvx\" (UniqueName: \"kubernetes.io/projected/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-kube-api-access-q7vvx\") pod \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.761254 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-inventory\") pod \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.761295 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-telemetry-combined-ca-bundle\") pod \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.761357 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-2\") pod \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\" (UID: \"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d\") " Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.767540 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-kube-api-access-q7vvx" (OuterVolumeSpecName: "kube-api-access-q7vvx") pod "f2406a66-d20f-4ac5-9817-a1bf1ff38c5d" (UID: "f2406a66-d20f-4ac5-9817-a1bf1ff38c5d"). InnerVolumeSpecName "kube-api-access-q7vvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.781754 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "f2406a66-d20f-4ac5-9817-a1bf1ff38c5d" (UID: "f2406a66-d20f-4ac5-9817-a1bf1ff38c5d"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.792494 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "f2406a66-d20f-4ac5-9817-a1bf1ff38c5d" (UID: "f2406a66-d20f-4ac5-9817-a1bf1ff38c5d"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.792975 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f2406a66-d20f-4ac5-9817-a1bf1ff38c5d" (UID: "f2406a66-d20f-4ac5-9817-a1bf1ff38c5d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.794965 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "f2406a66-d20f-4ac5-9817-a1bf1ff38c5d" (UID: "f2406a66-d20f-4ac5-9817-a1bf1ff38c5d"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.797719 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "f2406a66-d20f-4ac5-9817-a1bf1ff38c5d" (UID: "f2406a66-d20f-4ac5-9817-a1bf1ff38c5d"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.800114 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-inventory" (OuterVolumeSpecName: "inventory") pod "f2406a66-d20f-4ac5-9817-a1bf1ff38c5d" (UID: "f2406a66-d20f-4ac5-9817-a1bf1ff38c5d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.864170 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.864204 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.864221 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.864233 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7vvx\" (UniqueName: \"kubernetes.io/projected/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-kube-api-access-q7vvx\") on node \"crc\" DevicePath \"\"" Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.864247 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.864258 4835 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:41.864268 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f2406a66-d20f-4ac5-9817-a1bf1ff38c5d-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:42.293538 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" event={"ID":"f2406a66-d20f-4ac5-9817-a1bf1ff38c5d","Type":"ContainerDied","Data":"7fc77c50fd21328daf1df597f24bff4d20b510d649f57effad834ea19eb252d6"} Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:42.293848 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fc77c50fd21328daf1df597f24bff4d20b510d649f57effad834ea19eb252d6" Oct 03 19:01:42 crc kubenswrapper[4835]: I1003 19:01:42.293607 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp" Oct 03 19:02:05 crc kubenswrapper[4835]: I1003 19:02:05.358777 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:02:05 crc kubenswrapper[4835]: I1003 19:02:05.360034 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:02:18 crc kubenswrapper[4835]: I1003 19:02:18.995544 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 19:02:18 crc kubenswrapper[4835]: I1003 19:02:18.996386 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerName="prometheus" containerID="cri-o://08b6fbb1fe627ec9a53ec7af83710b0cc79ef5a3e7770f68b29545574299d4c2" gracePeriod=600 Oct 03 19:02:18 crc kubenswrapper[4835]: I1003 19:02:18.996536 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerName="thanos-sidecar" containerID="cri-o://ebc74297fb9b68d64cde4db4e2b3db9543551fe1ebda3c51b5f172bd8d17caa8" gracePeriod=600 Oct 03 19:02:18 crc kubenswrapper[4835]: I1003 19:02:18.996582 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerName="config-reloader" containerID="cri-o://a71bcb22bebacf36d00c0d1c06e34c37779006bb91d8db2e7161bf68a5112732" gracePeriod=600 Oct 03 19:02:19 crc kubenswrapper[4835]: E1003 19:02:19.542640 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33e000bd_5fc2_4a35_97f2_5fd3c1493f1c.slice/crio-conmon-08b6fbb1fe627ec9a53ec7af83710b0cc79ef5a3e7770f68b29545574299d4c2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33e000bd_5fc2_4a35_97f2_5fd3c1493f1c.slice/crio-a71bcb22bebacf36d00c0d1c06e34c37779006bb91d8db2e7161bf68a5112732.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33e000bd_5fc2_4a35_97f2_5fd3c1493f1c.slice/crio-conmon-a71bcb22bebacf36d00c0d1c06e34c37779006bb91d8db2e7161bf68a5112732.scope\": RecentStats: unable to find data in memory cache]" Oct 03 19:02:19 crc kubenswrapper[4835]: I1003 19:02:19.677900 4835 generic.go:334] "Generic (PLEG): container finished" podID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerID="ebc74297fb9b68d64cde4db4e2b3db9543551fe1ebda3c51b5f172bd8d17caa8" exitCode=0 Oct 03 19:02:19 crc kubenswrapper[4835]: I1003 19:02:19.678480 4835 generic.go:334] "Generic (PLEG): container finished" podID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerID="a71bcb22bebacf36d00c0d1c06e34c37779006bb91d8db2e7161bf68a5112732" exitCode=0 Oct 03 19:02:19 crc kubenswrapper[4835]: I1003 
19:02:19.678495 4835 generic.go:334] "Generic (PLEG): container finished" podID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerID="08b6fbb1fe627ec9a53ec7af83710b0cc79ef5a3e7770f68b29545574299d4c2" exitCode=0 Oct 03 19:02:19 crc kubenswrapper[4835]: I1003 19:02:19.678533 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c","Type":"ContainerDied","Data":"ebc74297fb9b68d64cde4db4e2b3db9543551fe1ebda3c51b5f172bd8d17caa8"} Oct 03 19:02:19 crc kubenswrapper[4835]: I1003 19:02:19.678583 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c","Type":"ContainerDied","Data":"a71bcb22bebacf36d00c0d1c06e34c37779006bb91d8db2e7161bf68a5112732"} Oct 03 19:02:19 crc kubenswrapper[4835]: I1003 19:02:19.678596 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c","Type":"ContainerDied","Data":"08b6fbb1fe627ec9a53ec7af83710b0cc79ef5a3e7770f68b29545574299d4c2"} Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.147236 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.239624 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.239696 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-secret-combined-ca-bundle\") pod \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.239740 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-tls-assets\") pod \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.239802 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-thanos-prometheus-http-client-file\") pod \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.239830 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config\") pod \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.239956 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") pod 
\"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.240005 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.240027 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-config\") pod \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.240051 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-prometheus-metric-storage-rulefiles-0\") pod \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.240103 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlcm5\" (UniqueName: \"kubernetes.io/projected/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-kube-api-access-hlcm5\") pod \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.240188 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-config-out\") pod \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\" (UID: \"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c\") " Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.244309 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" (UID: "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.247509 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-config-out" (OuterVolumeSpecName: "config-out") pod "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" (UID: "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.250177 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" (UID: "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.250965 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" (UID: "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.255409 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-kube-api-access-hlcm5" (OuterVolumeSpecName: "kube-api-access-hlcm5") pod "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" (UID: "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c"). InnerVolumeSpecName "kube-api-access-hlcm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.255456 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" (UID: "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.259232 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" (UID: "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.263306 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-config" (OuterVolumeSpecName: "config") pod "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" (UID: "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.268153 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" (UID: "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.274795 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" (UID: "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c"). InnerVolumeSpecName "pvc-bed34091-a921-4906-9121-f482ec67e99a". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.342848 4835 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-config-out\") on node \"crc\" DevicePath \"\"" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.342883 4835 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.342895 4835 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.342906 4835 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.342916 4835 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.342941 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-bed34091-a921-4906-9121-f482ec67e99a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") on node \"crc\" " Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.342952 4835 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.342962 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-config\") on node \"crc\" DevicePath \"\"" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.342971 4835 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.342980 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlcm5\" (UniqueName: \"kubernetes.io/projected/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-kube-api-access-hlcm5\") on node \"crc\" DevicePath \"\"" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.378497 4835 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.378664 4835 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-bed34091-a921-4906-9121-f482ec67e99a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a") on node "crc" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.384179 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config" (OuterVolumeSpecName: "web-config") pod "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" (UID: "33e000bd-5fc2-4a35-97f2-5fd3c1493f1c"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.444370 4835 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c-web-config\") on node \"crc\" DevicePath \"\"" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.444406 4835 reconciler_common.go:293] "Volume detached for volume \"pvc-bed34091-a921-4906-9121-f482ec67e99a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") on node \"crc\" DevicePath \"\"" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.689857 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"33e000bd-5fc2-4a35-97f2-5fd3c1493f1c","Type":"ContainerDied","Data":"555176beb785054d0479654e50dd5ccdf87d05b9cb0171d60589123f4a69305d"} Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.689913 4835 scope.go:117] "RemoveContainer" containerID="ebc74297fb9b68d64cde4db4e2b3db9543551fe1ebda3c51b5f172bd8d17caa8" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.689933 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.732747 4835 scope.go:117] "RemoveContainer" containerID="a71bcb22bebacf36d00c0d1c06e34c37779006bb91d8db2e7161bf68a5112732" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.733171 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.747483 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.761667 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 19:02:20 crc kubenswrapper[4835]: E1003 19:02:20.762113 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2406a66-d20f-4ac5-9817-a1bf1ff38c5d" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.762128 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2406a66-d20f-4ac5-9817-a1bf1ff38c5d" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 03 19:02:20 crc kubenswrapper[4835]: E1003 19:02:20.762141 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerName="config-reloader" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.762149 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerName="config-reloader" Oct 03 19:02:20 crc kubenswrapper[4835]: E1003 19:02:20.762167 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c21b19-0aed-4ab5-9d16-dcfa45e3236c" containerName="keystone-cron" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.762173 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c21b19-0aed-4ab5-9d16-dcfa45e3236c" containerName="keystone-cron" Oct 03 19:02:20 crc kubenswrapper[4835]: E1003 19:02:20.762202 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerName="prometheus" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.762208 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerName="prometheus" Oct 03 19:02:20 crc kubenswrapper[4835]: E1003 19:02:20.762226 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerName="init-config-reloader" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.762232 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerName="init-config-reloader" Oct 03 19:02:20 crc kubenswrapper[4835]: E1003 19:02:20.762243 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerName="thanos-sidecar" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.762248 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerName="thanos-sidecar" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.762356 4835 scope.go:117] "RemoveContainer" containerID="08b6fbb1fe627ec9a53ec7af83710b0cc79ef5a3e7770f68b29545574299d4c2" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.762421 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerName="config-reloader" 
Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.762431 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="99c21b19-0aed-4ab5-9d16-dcfa45e3236c" containerName="keystone-cron" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.762444 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2406a66-d20f-4ac5-9817-a1bf1ff38c5d" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.762455 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerName="prometheus" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.762469 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" containerName="thanos-sidecar" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.765281 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.772108 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.772187 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.773201 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.773217 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.773255 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-9kc6v" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.781803 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.783419 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.810013 4835 scope.go:117] "RemoveContainer" containerID="93b2837e510576d8bb572b89f16db0212e746865c19802cde66953cd8f7661d8" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.852108 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd52q\" (UniqueName: \"kubernetes.io/projected/07f8f72c-80ef-4fd1-a8d7-8167537568d3-kube-api-access-rd52q\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.852213 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.852306 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-config\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.852366 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/07f8f72c-80ef-4fd1-a8d7-8167537568d3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.852436 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bed34091-a921-4906-9121-f482ec67e99a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.852536 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.852595 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.852620 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/07f8f72c-80ef-4fd1-a8d7-8167537568d3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.852668 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.852732 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.852767 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/07f8f72c-80ef-4fd1-a8d7-8167537568d3-tls-assets\") 
pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.890298 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33e000bd-5fc2-4a35-97f2-5fd3c1493f1c" path="/var/lib/kubelet/pods/33e000bd-5fc2-4a35-97f2-5fd3c1493f1c/volumes" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.955027 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.955108 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.955136 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/07f8f72c-80ef-4fd1-a8d7-8167537568d3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.955166 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.955203 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/07f8f72c-80ef-4fd1-a8d7-8167537568d3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.955219 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.955272 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd52q\" (UniqueName: \"kubernetes.io/projected/07f8f72c-80ef-4fd1-a8d7-8167537568d3-kube-api-access-rd52q\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.955326 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.955415 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-config\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.955474 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/07f8f72c-80ef-4fd1-a8d7-8167537568d3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.955522 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bed34091-a921-4906-9121-f482ec67e99a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.957574 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/07f8f72c-80ef-4fd1-a8d7-8167537568d3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.959161 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.959203 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bed34091-a921-4906-9121-f482ec67e99a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/88bcb9a338b078b35fe2aaa6fcd1ca51c30be9164778c602eed472976adc1b23/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.962195 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.962968 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.963507 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/07f8f72c-80ef-4fd1-a8d7-8167537568d3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.973204 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.973267 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.973415 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/07f8f72c-80ef-4fd1-a8d7-8167537568d3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.973976 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-config\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.974483 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f8f72c-80ef-4fd1-a8d7-8167537568d3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:20 crc kubenswrapper[4835]: I1003 19:02:20.987968 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd52q\" (UniqueName: \"kubernetes.io/projected/07f8f72c-80ef-4fd1-a8d7-8167537568d3-kube-api-access-rd52q\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:21 crc kubenswrapper[4835]: I1003 19:02:21.025238 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bed34091-a921-4906-9121-f482ec67e99a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bed34091-a921-4906-9121-f482ec67e99a\") pod \"prometheus-metric-storage-0\" (UID: \"07f8f72c-80ef-4fd1-a8d7-8167537568d3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:21 crc kubenswrapper[4835]: I1003 19:02:21.093880 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:21 crc kubenswrapper[4835]: I1003 19:02:21.656182 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 19:02:21 crc kubenswrapper[4835]: I1003 19:02:21.719788 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"07f8f72c-80ef-4fd1-a8d7-8167537568d3","Type":"ContainerStarted","Data":"04b58594fbf6476885393c19eff11a563b64fef0a65c001c4ae9b19ea8949a91"} Oct 03 19:02:25 crc kubenswrapper[4835]: I1003 19:02:25.766052 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"07f8f72c-80ef-4fd1-a8d7-8167537568d3","Type":"ContainerStarted","Data":"ac394b137828e47f159685128fd25d243d90c3161388d6e0bb09039c96e1245d"} Oct 03 19:02:32 crc kubenswrapper[4835]: I1003 19:02:32.826458 4835 generic.go:334] "Generic (PLEG): container finished" podID="07f8f72c-80ef-4fd1-a8d7-8167537568d3" containerID="ac394b137828e47f159685128fd25d243d90c3161388d6e0bb09039c96e1245d" exitCode=0 Oct 03 19:02:32 crc kubenswrapper[4835]: I1003 19:02:32.826640 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"07f8f72c-80ef-4fd1-a8d7-8167537568d3","Type":"ContainerDied","Data":"ac394b137828e47f159685128fd25d243d90c3161388d6e0bb09039c96e1245d"} Oct 03 19:02:33 crc kubenswrapper[4835]: I1003 19:02:33.841230 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"07f8f72c-80ef-4fd1-a8d7-8167537568d3","Type":"ContainerStarted","Data":"abdaa8728383e5fb6049efc5b07c9df3443e7b852d1dbf9872d6c8d0aec0f65d"} Oct 03 19:02:35 crc kubenswrapper[4835]: I1003 19:02:35.359948 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:02:35 crc kubenswrapper[4835]: I1003 19:02:35.360483 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:02:37 crc kubenswrapper[4835]: I1003 19:02:37.889667 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"07f8f72c-80ef-4fd1-a8d7-8167537568d3","Type":"ContainerStarted","Data":"f01718b46d2425e4efe7732d5d0e4a99306a961bbe02166a2a2c0a9432b76fbe"} Oct 03 19:02:37 crc kubenswrapper[4835]: I1003 19:02:37.890356 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"07f8f72c-80ef-4fd1-a8d7-8167537568d3","Type":"ContainerStarted","Data":"c9a7e37b81833e1aacc49ca6ef4fa10856dd2016c624915ad31c147df5ffa0be"} Oct 03 19:02:37 crc kubenswrapper[4835]: I1003 19:02:37.924138 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.924112386 podStartE2EDuration="17.924112386s" podCreationTimestamp="2025-10-03 19:02:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 19:02:37.914277513 +0000 UTC m=+2899.630218385" watchObservedRunningTime="2025-10-03 19:02:37.924112386 +0000 UTC m=+2899.640053258" Oct 03 19:02:41 crc kubenswrapper[4835]: I1003 19:02:41.094619 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:51 crc kubenswrapper[4835]: I1003 19:02:51.095533 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:51 crc kubenswrapper[4835]: I1003 19:02:51.101451 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 03 19:02:52 crc kubenswrapper[4835]: I1003 19:02:52.020451 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 03 19:03:05 crc kubenswrapper[4835]: I1003 19:03:05.359040 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:03:05 crc kubenswrapper[4835]: I1003 19:03:05.359633 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:03:05 crc kubenswrapper[4835]: I1003 19:03:05.359680 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 19:03:05 crc kubenswrapper[4835]: I1003 19:03:05.360610 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 19:03:05 crc kubenswrapper[4835]: I1003 19:03:05.360672 4835 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" gracePeriod=600 Oct 03 19:03:06 crc kubenswrapper[4835]: I1003 19:03:06.158261 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" exitCode=0 Oct 03 19:03:06 crc kubenswrapper[4835]: I1003 19:03:06.158417 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f"} Oct 03 19:03:06 crc kubenswrapper[4835]: I1003 19:03:06.158703 4835 scope.go:117] "RemoveContainer" containerID="9398b6fcb039b8d8f5e49cb35136b0dc88780d3c5fd88ccc393e7775caa98125" Oct 03 19:03:06 crc kubenswrapper[4835]: E1003 19:03:06.205130 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:03:07 crc kubenswrapper[4835]: I1003 19:03:07.176535 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:03:07 crc kubenswrapper[4835]: E1003 19:03:07.176936 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.258550 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.260824 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.262504 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.263104 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.263133 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xkg98" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.263632 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.273533 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.348463 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.348964 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4f9d\" (UniqueName: \"kubernetes.io/projected/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-kube-api-access-s4f9d\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.349160 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.349350 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.349570 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-config-data\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.349701 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.349796 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.350032 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.350225 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.452758 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.453058 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.453144 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.453554 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.453621 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.453676 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4f9d\" (UniqueName: \"kubernetes.io/projected/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-kube-api-access-s4f9d\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.453710 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-openstack-config-secret\") pod 
\"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.453753 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.453807 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-config-data\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.453844 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.454584 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.454743 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.456467 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-config-data\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.456562 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.460241 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.464447 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc 
kubenswrapper[4835]: I1003 19:03:20.473034 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.496854 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4f9d\" (UniqueName: \"kubernetes.io/projected/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-kube-api-access-s4f9d\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.505607 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " pod="openstack/tempest-tests-tempest" Oct 03 19:03:20 crc kubenswrapper[4835]: I1003 19:03:20.582598 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 03 19:03:21 crc kubenswrapper[4835]: I1003 19:03:21.037530 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 03 19:03:21 crc kubenswrapper[4835]: I1003 19:03:21.044111 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 19:03:21 crc kubenswrapper[4835]: I1003 19:03:21.305956 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e","Type":"ContainerStarted","Data":"dd8a57745bb8e2352003022b139b0dd846c99dd4ec56d2ded2923ba735cf35e5"} Oct 03 19:03:21 crc kubenswrapper[4835]: I1003 19:03:21.877717 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:03:21 crc kubenswrapper[4835]: E1003 19:03:21.877980 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:03:35 crc kubenswrapper[4835]: I1003 19:03:35.878607 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:03:35 crc kubenswrapper[4835]: E1003 19:03:35.879964 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:03:46 crc kubenswrapper[4835]: I1003 19:03:46.880184 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:03:46 crc kubenswrapper[4835]: E1003 19:03:46.880963 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:03:49 crc kubenswrapper[4835]: E1003 19:03:49.959339 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-tempest-all:watcher_latest" Oct 03 19:03:49 crc kubenswrapper[4835]: E1003 19:03:49.960313 4835 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.82:5001/podified-master-centos10/openstack-tempest-all:watcher_latest" Oct 03 19:03:49 crc kubenswrapper[4835]: E1003 19:03:49.960499 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:38.102.83.82:5001/podified-master-centos10/openstack-tempest-all:watcher_latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s4f9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefi
x:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(87c3be87-c5ee-4d08-a75d-dfeb16c19d7e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 19:03:49 crc kubenswrapper[4835]: E1003 19:03:49.961646 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="87c3be87-c5ee-4d08-a75d-dfeb16c19d7e" Oct 03 19:03:50 crc kubenswrapper[4835]: E1003 19:03:50.617147 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.82:5001/podified-master-centos10/openstack-tempest-all:watcher_latest\\\"\"" pod="openstack/tempest-tests-tempest" podUID="87c3be87-c5ee-4d08-a75d-dfeb16c19d7e" Oct 03 19:04:00 crc kubenswrapper[4835]: I1003 19:04:00.878770 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:04:00 crc kubenswrapper[4835]: E1003 19:04:00.880591 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:04:05 crc kubenswrapper[4835]: I1003 19:04:05.764344 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e","Type":"ContainerStarted","Data":"ae2950a8851e4b0ebb401782ce8dbd50cd89b920a0a61a84e362dd26c2ff62fd"} Oct 03 19:04:05 crc kubenswrapper[4835]: I1003 19:04:05.797381 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.6976539 podStartE2EDuration="46.797357131s" podCreationTimestamp="2025-10-03 19:03:19 +0000 UTC" firstStartedPulling="2025-10-03 19:03:21.043872642 +0000 UTC m=+2942.759813504" lastFinishedPulling="2025-10-03 19:04:04.143575863 +0000 UTC m=+2985.859516735" observedRunningTime="2025-10-03 19:04:05.790799149 +0000 UTC m=+2987.506740021" watchObservedRunningTime="2025-10-03 19:04:05.797357131 +0000 UTC m=+2987.513298003" Oct 03 19:04:11 crc kubenswrapper[4835]: I1003 19:04:11.877340 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:04:11 crc kubenswrapper[4835]: E1003 19:04:11.878838 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:04:25 crc kubenswrapper[4835]: I1003 19:04:25.878254 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:04:25 crc kubenswrapper[4835]: E1003 19:04:25.879089 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:04:39 crc kubenswrapper[4835]: I1003 19:04:39.876729 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:04:39 crc kubenswrapper[4835]: E1003 19:04:39.877542 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:04:50 crc kubenswrapper[4835]: I1003 19:04:50.877374 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:04:50 crc kubenswrapper[4835]: E1003 19:04:50.878886 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:05:05 crc kubenswrapper[4835]: I1003 19:05:05.877103 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:05:05 crc kubenswrapper[4835]: E1003 19:05:05.877946 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:05:18 crc kubenswrapper[4835]: I1003 19:05:18.893347 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:05:18 crc kubenswrapper[4835]: E1003 19:05:18.895596 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" 
podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:05:31 crc kubenswrapper[4835]: I1003 19:05:31.877007 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:05:31 crc kubenswrapper[4835]: E1003 19:05:31.877764 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:05:45 crc kubenswrapper[4835]: I1003 19:05:45.876985 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:05:45 crc kubenswrapper[4835]: E1003 19:05:45.878153 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:05:56 crc kubenswrapper[4835]: I1003 19:05:56.877724 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:05:56 crc kubenswrapper[4835]: E1003 19:05:56.878691 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:06:10 crc kubenswrapper[4835]: I1003 19:06:10.877903 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:06:10 crc kubenswrapper[4835]: E1003 19:06:10.878715 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:06:25 crc kubenswrapper[4835]: I1003 19:06:25.877209 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:06:25 crc kubenswrapper[4835]: E1003 19:06:25.878300 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:06:39 crc kubenswrapper[4835]: I1003 19:06:39.877053 4835 scope.go:117] "RemoveContainer" 
containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:06:39 crc kubenswrapper[4835]: E1003 19:06:39.877900 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:06:52 crc kubenswrapper[4835]: I1003 19:06:52.877478 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:06:52 crc kubenswrapper[4835]: E1003 19:06:52.878346 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:07:07 crc kubenswrapper[4835]: I1003 19:07:07.877556 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:07:07 crc kubenswrapper[4835]: E1003 19:07:07.878514 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:07:21 crc kubenswrapper[4835]: I1003 19:07:21.876937 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:07:21 crc kubenswrapper[4835]: E1003 19:07:21.877966 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:07:34 crc kubenswrapper[4835]: I1003 19:07:34.878543 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:07:34 crc kubenswrapper[4835]: E1003 19:07:34.880215 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:07:46 crc kubenswrapper[4835]: I1003 19:07:46.877105 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:07:46 crc kubenswrapper[4835]: E1003 19:07:46.877936 4835 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:08:01 crc kubenswrapper[4835]: I1003 19:08:01.877409 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:08:01 crc kubenswrapper[4835]: E1003 19:08:01.878574 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:08:12 crc kubenswrapper[4835]: I1003 19:08:12.877479 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:08:14 crc kubenswrapper[4835]: I1003 19:08:14.197200 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"39ca16f78a66381fda435b1c9590cc632e8fcc1885b5333b2a7eeea08dbe5272"} Oct 03 19:08:31 crc kubenswrapper[4835]: I1003 19:08:31.518864 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-khvwb"] Oct 03 19:08:31 crc kubenswrapper[4835]: I1003 19:08:31.529330 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:31 crc kubenswrapper[4835]: I1003 19:08:31.556809 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-khvwb"] Oct 03 19:08:31 crc kubenswrapper[4835]: I1003 19:08:31.669174 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c6101a-7f77-4c18-9c0d-41510a0b979a-utilities\") pod \"redhat-marketplace-khvwb\" (UID: \"16c6101a-7f77-4c18-9c0d-41510a0b979a\") " pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:31 crc kubenswrapper[4835]: I1003 19:08:31.669377 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs9zs\" (UniqueName: \"kubernetes.io/projected/16c6101a-7f77-4c18-9c0d-41510a0b979a-kube-api-access-rs9zs\") pod \"redhat-marketplace-khvwb\" (UID: \"16c6101a-7f77-4c18-9c0d-41510a0b979a\") " pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:31 crc kubenswrapper[4835]: I1003 19:08:31.669630 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c6101a-7f77-4c18-9c0d-41510a0b979a-catalog-content\") pod \"redhat-marketplace-khvwb\" (UID: \"16c6101a-7f77-4c18-9c0d-41510a0b979a\") " pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:31 crc kubenswrapper[4835]: I1003 19:08:31.771294 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c6101a-7f77-4c18-9c0d-41510a0b979a-utilities\") pod \"redhat-marketplace-khvwb\" (UID: \"16c6101a-7f77-4c18-9c0d-41510a0b979a\") " pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:31 crc kubenswrapper[4835]: I1003 19:08:31.771365 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs9zs\" (UniqueName: \"kubernetes.io/projected/16c6101a-7f77-4c18-9c0d-41510a0b979a-kube-api-access-rs9zs\") pod \"redhat-marketplace-khvwb\" (UID: \"16c6101a-7f77-4c18-9c0d-41510a0b979a\") " pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:31 crc kubenswrapper[4835]: I1003 19:08:31.771430 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c6101a-7f77-4c18-9c0d-41510a0b979a-catalog-content\") pod \"redhat-marketplace-khvwb\" (UID: \"16c6101a-7f77-4c18-9c0d-41510a0b979a\") " pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:31 crc kubenswrapper[4835]: I1003 19:08:31.772213 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c6101a-7f77-4c18-9c0d-41510a0b979a-utilities\") pod \"redhat-marketplace-khvwb\" (UID: \"16c6101a-7f77-4c18-9c0d-41510a0b979a\") " pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:31 crc kubenswrapper[4835]: I1003 19:08:31.772309 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c6101a-7f77-4c18-9c0d-41510a0b979a-catalog-content\") pod \"redhat-marketplace-khvwb\" (UID: \"16c6101a-7f77-4c18-9c0d-41510a0b979a\") " pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:31 crc kubenswrapper[4835]: I1003 19:08:31.815540 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rs9zs\" (UniqueName: \"kubernetes.io/projected/16c6101a-7f77-4c18-9c0d-41510a0b979a-kube-api-access-rs9zs\") pod \"redhat-marketplace-khvwb\" (UID: \"16c6101a-7f77-4c18-9c0d-41510a0b979a\") " pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:31 crc kubenswrapper[4835]: I1003 19:08:31.863310 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:32 crc kubenswrapper[4835]: I1003 19:08:32.350531 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-khvwb"] Oct 03 19:08:32 crc kubenswrapper[4835]: I1003 19:08:32.390928 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khvwb" event={"ID":"16c6101a-7f77-4c18-9c0d-41510a0b979a","Type":"ContainerStarted","Data":"d4e79cbffee0c7debb835acb0f8c607d4bf516ee9cf5eb82b8f37481c791f575"} Oct 03 19:08:33 crc kubenswrapper[4835]: I1003 19:08:33.408171 4835 generic.go:334] "Generic (PLEG): container finished" podID="16c6101a-7f77-4c18-9c0d-41510a0b979a" containerID="09f2210d638c1484c02fcdec707c7c5fb9486e853891a12dcd9ad41972ca1b39" exitCode=0 Oct 03 19:08:33 crc kubenswrapper[4835]: I1003 19:08:33.408244 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khvwb" event={"ID":"16c6101a-7f77-4c18-9c0d-41510a0b979a","Type":"ContainerDied","Data":"09f2210d638c1484c02fcdec707c7c5fb9486e853891a12dcd9ad41972ca1b39"} Oct 03 19:08:33 crc kubenswrapper[4835]: I1003 19:08:33.419738 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 19:08:35 crc kubenswrapper[4835]: I1003 19:08:35.446310 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khvwb" event={"ID":"16c6101a-7f77-4c18-9c0d-41510a0b979a","Type":"ContainerStarted","Data":"8733f1f4f8aae830136814d1be1f7334d2be4677e499702f3bf197fe5cc601ef"} Oct 03 19:08:36 crc kubenswrapper[4835]: I1003 19:08:36.458864 4835 generic.go:334] "Generic (PLEG): container finished" podID="16c6101a-7f77-4c18-9c0d-41510a0b979a" containerID="8733f1f4f8aae830136814d1be1f7334d2be4677e499702f3bf197fe5cc601ef" exitCode=0 Oct 03 19:08:36 crc kubenswrapper[4835]: I1003 19:08:36.458963 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khvwb" event={"ID":"16c6101a-7f77-4c18-9c0d-41510a0b979a","Type":"ContainerDied","Data":"8733f1f4f8aae830136814d1be1f7334d2be4677e499702f3bf197fe5cc601ef"} Oct 03 19:08:38 crc kubenswrapper[4835]: I1003 19:08:38.478212 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khvwb" event={"ID":"16c6101a-7f77-4c18-9c0d-41510a0b979a","Type":"ContainerStarted","Data":"41a30a793c5054906db0a126ae458d07fe69dfea4c267bb531bff98ef2639f29"} Oct 03 19:08:38 crc kubenswrapper[4835]: I1003 19:08:38.504307 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-khvwb" podStartSLOduration=3.3483123790000002 podStartE2EDuration="7.504285378s" podCreationTimestamp="2025-10-03 19:08:31 +0000 UTC" firstStartedPulling="2025-10-03 19:08:33.419011299 +0000 UTC m=+3255.134952191" lastFinishedPulling="2025-10-03 19:08:37.574984308 +0000 UTC m=+3259.290925190" observedRunningTime="2025-10-03 19:08:38.496725861 +0000 UTC m=+3260.212666733" watchObservedRunningTime="2025-10-03 19:08:38.504285378 +0000 UTC 
m=+3260.220226250" Oct 03 19:08:41 crc kubenswrapper[4835]: I1003 19:08:41.864588 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:41 crc kubenswrapper[4835]: I1003 19:08:41.866324 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:41 crc kubenswrapper[4835]: I1003 19:08:41.931337 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:42 crc kubenswrapper[4835]: I1003 19:08:42.581728 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:42 crc kubenswrapper[4835]: I1003 19:08:42.631651 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-khvwb"] Oct 03 19:08:44 crc kubenswrapper[4835]: I1003 19:08:44.555035 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-khvwb" podUID="16c6101a-7f77-4c18-9c0d-41510a0b979a" containerName="registry-server" containerID="cri-o://41a30a793c5054906db0a126ae458d07fe69dfea4c267bb531bff98ef2639f29" gracePeriod=2 Oct 03 19:08:45 crc kubenswrapper[4835]: I1003 19:08:45.574231 4835 generic.go:334] "Generic (PLEG): container finished" podID="16c6101a-7f77-4c18-9c0d-41510a0b979a" containerID="41a30a793c5054906db0a126ae458d07fe69dfea4c267bb531bff98ef2639f29" exitCode=0 Oct 03 19:08:45 crc kubenswrapper[4835]: I1003 19:08:45.574323 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khvwb" event={"ID":"16c6101a-7f77-4c18-9c0d-41510a0b979a","Type":"ContainerDied","Data":"41a30a793c5054906db0a126ae458d07fe69dfea4c267bb531bff98ef2639f29"} Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.105789 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.234422 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs9zs\" (UniqueName: \"kubernetes.io/projected/16c6101a-7f77-4c18-9c0d-41510a0b979a-kube-api-access-rs9zs\") pod \"16c6101a-7f77-4c18-9c0d-41510a0b979a\" (UID: \"16c6101a-7f77-4c18-9c0d-41510a0b979a\") " Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.234489 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c6101a-7f77-4c18-9c0d-41510a0b979a-utilities\") pod \"16c6101a-7f77-4c18-9c0d-41510a0b979a\" (UID: \"16c6101a-7f77-4c18-9c0d-41510a0b979a\") " Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.234536 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c6101a-7f77-4c18-9c0d-41510a0b979a-catalog-content\") pod \"16c6101a-7f77-4c18-9c0d-41510a0b979a\" (UID: \"16c6101a-7f77-4c18-9c0d-41510a0b979a\") " Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.237724 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16c6101a-7f77-4c18-9c0d-41510a0b979a-utilities" (OuterVolumeSpecName: "utilities") pod "16c6101a-7f77-4c18-9c0d-41510a0b979a" (UID: "16c6101a-7f77-4c18-9c0d-41510a0b979a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.241536 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16c6101a-7f77-4c18-9c0d-41510a0b979a-kube-api-access-rs9zs" (OuterVolumeSpecName: "kube-api-access-rs9zs") pod "16c6101a-7f77-4c18-9c0d-41510a0b979a" (UID: "16c6101a-7f77-4c18-9c0d-41510a0b979a"). InnerVolumeSpecName "kube-api-access-rs9zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.256442 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16c6101a-7f77-4c18-9c0d-41510a0b979a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16c6101a-7f77-4c18-9c0d-41510a0b979a" (UID: "16c6101a-7f77-4c18-9c0d-41510a0b979a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.336824 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c6101a-7f77-4c18-9c0d-41510a0b979a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.336892 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs9zs\" (UniqueName: \"kubernetes.io/projected/16c6101a-7f77-4c18-9c0d-41510a0b979a-kube-api-access-rs9zs\") on node \"crc\" DevicePath \"\"" Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.336906 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c6101a-7f77-4c18-9c0d-41510a0b979a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.585842 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-khvwb" event={"ID":"16c6101a-7f77-4c18-9c0d-41510a0b979a","Type":"ContainerDied","Data":"d4e79cbffee0c7debb835acb0f8c607d4bf516ee9cf5eb82b8f37481c791f575"} Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.585909 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-khvwb" Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.585906 4835 scope.go:117] "RemoveContainer" containerID="41a30a793c5054906db0a126ae458d07fe69dfea4c267bb531bff98ef2639f29" Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.607683 4835 scope.go:117] "RemoveContainer" containerID="8733f1f4f8aae830136814d1be1f7334d2be4677e499702f3bf197fe5cc601ef" Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.631531 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-khvwb"] Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.640294 4835 scope.go:117] "RemoveContainer" containerID="09f2210d638c1484c02fcdec707c7c5fb9486e853891a12dcd9ad41972ca1b39" Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.644087 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-khvwb"] Oct 03 19:08:46 crc kubenswrapper[4835]: I1003 19:08:46.887766 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16c6101a-7f77-4c18-9c0d-41510a0b979a" path="/var/lib/kubelet/pods/16c6101a-7f77-4c18-9c0d-41510a0b979a/volumes" Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.050666 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nlppw"] Oct 03 19:10:01 crc kubenswrapper[4835]: E1003 19:10:01.051813 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c6101a-7f77-4c18-9c0d-41510a0b979a" containerName="registry-server" Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.051830 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c6101a-7f77-4c18-9c0d-41510a0b979a" containerName="registry-server" Oct 03 19:10:01 crc kubenswrapper[4835]: E1003 19:10:01.051844 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c6101a-7f77-4c18-9c0d-41510a0b979a" containerName="extract-utilities" Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.051852 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c6101a-7f77-4c18-9c0d-41510a0b979a" containerName="extract-utilities" Oct 03 19:10:01 crc kubenswrapper[4835]: E1003 19:10:01.051872 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c6101a-7f77-4c18-9c0d-41510a0b979a" containerName="extract-content" Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.051879 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c6101a-7f77-4c18-9c0d-41510a0b979a" containerName="extract-content" Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.052365 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="16c6101a-7f77-4c18-9c0d-41510a0b979a" containerName="registry-server" Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.054035 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.070209 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nlppw"] Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.143970 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfa5afca-b50e-4111-8e18-a14835de06f9-utilities\") pod \"certified-operators-nlppw\" (UID: \"dfa5afca-b50e-4111-8e18-a14835de06f9\") " pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.144043 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfa5afca-b50e-4111-8e18-a14835de06f9-catalog-content\") pod \"certified-operators-nlppw\" (UID: \"dfa5afca-b50e-4111-8e18-a14835de06f9\") " pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.144083 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkkxb\" (UniqueName: \"kubernetes.io/projected/dfa5afca-b50e-4111-8e18-a14835de06f9-kube-api-access-tkkxb\") pod \"certified-operators-nlppw\" (UID: \"dfa5afca-b50e-4111-8e18-a14835de06f9\") " pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.246477 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfa5afca-b50e-4111-8e18-a14835de06f9-utilities\") pod \"certified-operators-nlppw\" (UID: \"dfa5afca-b50e-4111-8e18-a14835de06f9\") " pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.246559 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfa5afca-b50e-4111-8e18-a14835de06f9-catalog-content\") pod \"certified-operators-nlppw\" (UID: \"dfa5afca-b50e-4111-8e18-a14835de06f9\") " pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.246583 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkkxb\" (UniqueName: \"kubernetes.io/projected/dfa5afca-b50e-4111-8e18-a14835de06f9-kube-api-access-tkkxb\") pod \"certified-operators-nlppw\" (UID: \"dfa5afca-b50e-4111-8e18-a14835de06f9\") " pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.247534 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfa5afca-b50e-4111-8e18-a14835de06f9-catalog-content\") pod \"certified-operators-nlppw\" (UID: \"dfa5afca-b50e-4111-8e18-a14835de06f9\") " pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.247770 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfa5afca-b50e-4111-8e18-a14835de06f9-utilities\") pod \"certified-operators-nlppw\" (UID: \"dfa5afca-b50e-4111-8e18-a14835de06f9\") " pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.272396 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tkkxb\" (UniqueName: \"kubernetes.io/projected/dfa5afca-b50e-4111-8e18-a14835de06f9-kube-api-access-tkkxb\") pod \"certified-operators-nlppw\" (UID: \"dfa5afca-b50e-4111-8e18-a14835de06f9\") " pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:01 crc kubenswrapper[4835]: I1003 19:10:01.385208 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:02 crc kubenswrapper[4835]: I1003 19:10:02.117537 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nlppw"] Oct 03 19:10:02 crc kubenswrapper[4835]: I1003 19:10:02.391062 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlppw" event={"ID":"dfa5afca-b50e-4111-8e18-a14835de06f9","Type":"ContainerStarted","Data":"3d3d431b62038d37d69ae0cf3ebf5c21ab79b6fc2f565c48b1b53e71c165d52d"} Oct 03 19:10:02 crc kubenswrapper[4835]: I1003 19:10:02.391625 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlppw" event={"ID":"dfa5afca-b50e-4111-8e18-a14835de06f9","Type":"ContainerStarted","Data":"bb4ad48a108221d7341ae9adb002e50c795f3aef1d3ff8a5954c417d937a0539"} Oct 03 19:10:03 crc kubenswrapper[4835]: I1003 19:10:03.403122 4835 generic.go:334] "Generic (PLEG): container finished" podID="dfa5afca-b50e-4111-8e18-a14835de06f9" containerID="3d3d431b62038d37d69ae0cf3ebf5c21ab79b6fc2f565c48b1b53e71c165d52d" exitCode=0 Oct 03 19:10:03 crc kubenswrapper[4835]: I1003 19:10:03.403174 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlppw" event={"ID":"dfa5afca-b50e-4111-8e18-a14835de06f9","Type":"ContainerDied","Data":"3d3d431b62038d37d69ae0cf3ebf5c21ab79b6fc2f565c48b1b53e71c165d52d"} Oct 03 19:10:04 crc kubenswrapper[4835]: I1003 19:10:04.421105 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlppw" event={"ID":"dfa5afca-b50e-4111-8e18-a14835de06f9","Type":"ContainerStarted","Data":"82fc78a97fffceaac1048c97411affade5ea277da7276e3d6e2c308ce3cb3ebd"} Oct 03 19:10:05 crc kubenswrapper[4835]: I1003 19:10:05.433583 4835 generic.go:334] "Generic (PLEG): container finished" podID="dfa5afca-b50e-4111-8e18-a14835de06f9" containerID="82fc78a97fffceaac1048c97411affade5ea277da7276e3d6e2c308ce3cb3ebd" exitCode=0 Oct 03 19:10:05 crc kubenswrapper[4835]: I1003 19:10:05.433642 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlppw" event={"ID":"dfa5afca-b50e-4111-8e18-a14835de06f9","Type":"ContainerDied","Data":"82fc78a97fffceaac1048c97411affade5ea277da7276e3d6e2c308ce3cb3ebd"} Oct 03 19:10:07 crc kubenswrapper[4835]: I1003 19:10:07.431164 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nftx4"] Oct 03 19:10:07 crc kubenswrapper[4835]: I1003 19:10:07.433802 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:07 crc kubenswrapper[4835]: I1003 19:10:07.448414 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nftx4"] Oct 03 19:10:07 crc kubenswrapper[4835]: I1003 19:10:07.469543 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlppw" event={"ID":"dfa5afca-b50e-4111-8e18-a14835de06f9","Type":"ContainerStarted","Data":"c92988bb45ba34e1593e3a684d07e75af0b5d051d1f23eec9f1d0ed067fedea9"} Oct 03 19:10:07 crc kubenswrapper[4835]: I1003 19:10:07.494661 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nlppw" podStartSLOduration=1.852010517 podStartE2EDuration="6.49463689s" podCreationTimestamp="2025-10-03 19:10:01 +0000 UTC" firstStartedPulling="2025-10-03 19:10:02.393731635 +0000 UTC m=+3344.109672497" lastFinishedPulling="2025-10-03 19:10:07.036357998 +0000 UTC m=+3348.752298870" observedRunningTime="2025-10-03 19:10:07.488409067 +0000 UTC m=+3349.204349929" watchObservedRunningTime="2025-10-03 19:10:07.49463689 +0000 UTC m=+3349.210577762" Oct 03 19:10:07 crc kubenswrapper[4835]: I1003 19:10:07.588256 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccrsj\" (UniqueName: \"kubernetes.io/projected/b2034453-d610-46b0-acf2-399ca863c110-kube-api-access-ccrsj\") pod \"community-operators-nftx4\" (UID: \"b2034453-d610-46b0-acf2-399ca863c110\") " pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:07 crc kubenswrapper[4835]: I1003 19:10:07.588450 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2034453-d610-46b0-acf2-399ca863c110-catalog-content\") pod \"community-operators-nftx4\" (UID: \"b2034453-d610-46b0-acf2-399ca863c110\") " pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:07 crc kubenswrapper[4835]: I1003 19:10:07.588539 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2034453-d610-46b0-acf2-399ca863c110-utilities\") pod \"community-operators-nftx4\" (UID: \"b2034453-d610-46b0-acf2-399ca863c110\") " pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:07 crc kubenswrapper[4835]: I1003 19:10:07.691111 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2034453-d610-46b0-acf2-399ca863c110-catalog-content\") pod \"community-operators-nftx4\" (UID: \"b2034453-d610-46b0-acf2-399ca863c110\") " pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:07 crc kubenswrapper[4835]: I1003 19:10:07.691230 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2034453-d610-46b0-acf2-399ca863c110-utilities\") pod \"community-operators-nftx4\" (UID: \"b2034453-d610-46b0-acf2-399ca863c110\") " pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:07 crc kubenswrapper[4835]: I1003 19:10:07.691370 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccrsj\" (UniqueName: \"kubernetes.io/projected/b2034453-d610-46b0-acf2-399ca863c110-kube-api-access-ccrsj\") pod \"community-operators-nftx4\" (UID: 
\"b2034453-d610-46b0-acf2-399ca863c110\") " pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:07 crc kubenswrapper[4835]: I1003 19:10:07.692399 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2034453-d610-46b0-acf2-399ca863c110-catalog-content\") pod \"community-operators-nftx4\" (UID: \"b2034453-d610-46b0-acf2-399ca863c110\") " pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:07 crc kubenswrapper[4835]: I1003 19:10:07.692734 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2034453-d610-46b0-acf2-399ca863c110-utilities\") pod \"community-operators-nftx4\" (UID: \"b2034453-d610-46b0-acf2-399ca863c110\") " pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:07 crc kubenswrapper[4835]: I1003 19:10:07.731246 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccrsj\" (UniqueName: \"kubernetes.io/projected/b2034453-d610-46b0-acf2-399ca863c110-kube-api-access-ccrsj\") pod \"community-operators-nftx4\" (UID: \"b2034453-d610-46b0-acf2-399ca863c110\") " pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:07 crc kubenswrapper[4835]: I1003 19:10:07.769486 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:08 crc kubenswrapper[4835]: I1003 19:10:08.337405 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nftx4"] Oct 03 19:10:08 crc kubenswrapper[4835]: W1003 19:10:08.349434 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2034453_d610_46b0_acf2_399ca863c110.slice/crio-b8ae57b62c83d005e2d2e2631223f01ce70d547658ce28fd2606bc455b134af1 WatchSource:0}: Error finding container b8ae57b62c83d005e2d2e2631223f01ce70d547658ce28fd2606bc455b134af1: Status 404 returned error can't find the container with id b8ae57b62c83d005e2d2e2631223f01ce70d547658ce28fd2606bc455b134af1 Oct 03 19:10:08 crc kubenswrapper[4835]: I1003 19:10:08.480285 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nftx4" event={"ID":"b2034453-d610-46b0-acf2-399ca863c110","Type":"ContainerStarted","Data":"b8ae57b62c83d005e2d2e2631223f01ce70d547658ce28fd2606bc455b134af1"} Oct 03 19:10:09 crc kubenswrapper[4835]: I1003 19:10:09.493567 4835 generic.go:334] "Generic (PLEG): container finished" podID="b2034453-d610-46b0-acf2-399ca863c110" containerID="bbabceaaa961434676d1f2b6d0710e84908f28cd0aaeae406ed8889110394b90" exitCode=0 Oct 03 19:10:09 crc kubenswrapper[4835]: I1003 19:10:09.493662 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nftx4" event={"ID":"b2034453-d610-46b0-acf2-399ca863c110","Type":"ContainerDied","Data":"bbabceaaa961434676d1f2b6d0710e84908f28cd0aaeae406ed8889110394b90"} Oct 03 19:10:11 crc kubenswrapper[4835]: I1003 19:10:11.385948 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:11 crc kubenswrapper[4835]: I1003 19:10:11.386280 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:11 crc kubenswrapper[4835]: I1003 19:10:11.519943 4835 generic.go:334] "Generic (PLEG): 
container finished" podID="b2034453-d610-46b0-acf2-399ca863c110" containerID="ac20b5aa916a42fd6c8a856a60f79d6b416bfd417af374b137993134dd7f0d76" exitCode=0 Oct 03 19:10:11 crc kubenswrapper[4835]: I1003 19:10:11.520008 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nftx4" event={"ID":"b2034453-d610-46b0-acf2-399ca863c110","Type":"ContainerDied","Data":"ac20b5aa916a42fd6c8a856a60f79d6b416bfd417af374b137993134dd7f0d76"} Oct 03 19:10:12 crc kubenswrapper[4835]: I1003 19:10:12.431845 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-nlppw" podUID="dfa5afca-b50e-4111-8e18-a14835de06f9" containerName="registry-server" probeResult="failure" output=< Oct 03 19:10:12 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Oct 03 19:10:12 crc kubenswrapper[4835]: > Oct 03 19:10:16 crc kubenswrapper[4835]: I1003 19:10:16.582701 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nftx4" event={"ID":"b2034453-d610-46b0-acf2-399ca863c110","Type":"ContainerStarted","Data":"49b1c8171ea8cd57e9e5f70a373a10ecc6681c2fb3eafd6d314a490e0885b708"} Oct 03 19:10:16 crc kubenswrapper[4835]: I1003 19:10:16.619592 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nftx4" podStartSLOduration=2.9935739999999997 podStartE2EDuration="9.619562083s" podCreationTimestamp="2025-10-03 19:10:07 +0000 UTC" firstStartedPulling="2025-10-03 19:10:09.496325345 +0000 UTC m=+3351.212266227" lastFinishedPulling="2025-10-03 19:10:16.122313438 +0000 UTC m=+3357.838254310" observedRunningTime="2025-10-03 19:10:16.609004283 +0000 UTC m=+3358.324945155" watchObservedRunningTime="2025-10-03 19:10:16.619562083 +0000 UTC m=+3358.335502955" Oct 03 19:10:17 crc kubenswrapper[4835]: I1003 19:10:17.770038 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:17 crc kubenswrapper[4835]: I1003 19:10:17.771431 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:18 crc kubenswrapper[4835]: I1003 19:10:18.834729 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-nftx4" podUID="b2034453-d610-46b0-acf2-399ca863c110" containerName="registry-server" probeResult="failure" output=< Oct 03 19:10:18 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Oct 03 19:10:18 crc kubenswrapper[4835]: > Oct 03 19:10:22 crc kubenswrapper[4835]: I1003 19:10:22.445816 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-nlppw" podUID="dfa5afca-b50e-4111-8e18-a14835de06f9" containerName="registry-server" probeResult="failure" output=< Oct 03 19:10:22 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Oct 03 19:10:22 crc kubenswrapper[4835]: > Oct 03 19:10:23 crc kubenswrapper[4835]: I1003 19:10:23.567547 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gq4rb"] Oct 03 19:10:23 crc kubenswrapper[4835]: I1003 19:10:23.571394 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:23 crc kubenswrapper[4835]: I1003 19:10:23.582259 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gq4rb"] Oct 03 19:10:23 crc kubenswrapper[4835]: I1003 19:10:23.634182 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d0dc76-b6b4-419e-b069-32f7eacf15be-utilities\") pod \"redhat-operators-gq4rb\" (UID: \"b3d0dc76-b6b4-419e-b069-32f7eacf15be\") " pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:23 crc kubenswrapper[4835]: I1003 19:10:23.634544 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2vsm\" (UniqueName: \"kubernetes.io/projected/b3d0dc76-b6b4-419e-b069-32f7eacf15be-kube-api-access-t2vsm\") pod \"redhat-operators-gq4rb\" (UID: \"b3d0dc76-b6b4-419e-b069-32f7eacf15be\") " pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:23 crc kubenswrapper[4835]: I1003 19:10:23.634822 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d0dc76-b6b4-419e-b069-32f7eacf15be-catalog-content\") pod \"redhat-operators-gq4rb\" (UID: \"b3d0dc76-b6b4-419e-b069-32f7eacf15be\") " pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:23 crc kubenswrapper[4835]: I1003 19:10:23.738571 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d0dc76-b6b4-419e-b069-32f7eacf15be-utilities\") pod \"redhat-operators-gq4rb\" (UID: \"b3d0dc76-b6b4-419e-b069-32f7eacf15be\") " pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:23 crc kubenswrapper[4835]: I1003 19:10:23.738721 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2vsm\" (UniqueName: \"kubernetes.io/projected/b3d0dc76-b6b4-419e-b069-32f7eacf15be-kube-api-access-t2vsm\") pod \"redhat-operators-gq4rb\" (UID: \"b3d0dc76-b6b4-419e-b069-32f7eacf15be\") " pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:23 crc kubenswrapper[4835]: I1003 19:10:23.738772 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d0dc76-b6b4-419e-b069-32f7eacf15be-catalog-content\") pod \"redhat-operators-gq4rb\" (UID: \"b3d0dc76-b6b4-419e-b069-32f7eacf15be\") " pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:23 crc kubenswrapper[4835]: I1003 19:10:23.739309 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d0dc76-b6b4-419e-b069-32f7eacf15be-catalog-content\") pod \"redhat-operators-gq4rb\" (UID: \"b3d0dc76-b6b4-419e-b069-32f7eacf15be\") " pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:23 crc kubenswrapper[4835]: I1003 19:10:23.739462 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d0dc76-b6b4-419e-b069-32f7eacf15be-utilities\") pod \"redhat-operators-gq4rb\" (UID: \"b3d0dc76-b6b4-419e-b069-32f7eacf15be\") " pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:23 crc kubenswrapper[4835]: I1003 19:10:23.774642 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t2vsm\" (UniqueName: \"kubernetes.io/projected/b3d0dc76-b6b4-419e-b069-32f7eacf15be-kube-api-access-t2vsm\") pod \"redhat-operators-gq4rb\" (UID: \"b3d0dc76-b6b4-419e-b069-32f7eacf15be\") " pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:23 crc kubenswrapper[4835]: I1003 19:10:23.902348 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:24 crc kubenswrapper[4835]: I1003 19:10:24.441058 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gq4rb"] Oct 03 19:10:24 crc kubenswrapper[4835]: I1003 19:10:24.668292 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq4rb" event={"ID":"b3d0dc76-b6b4-419e-b069-32f7eacf15be","Type":"ContainerStarted","Data":"4942068aab5a35d7bd104877c4abb1288a33b20de65ea0d7cf647a04452a5482"} Oct 03 19:10:25 crc kubenswrapper[4835]: I1003 19:10:25.680206 4835 generic.go:334] "Generic (PLEG): container finished" podID="b3d0dc76-b6b4-419e-b069-32f7eacf15be" containerID="23bec27044c9aa66117bd745713919ffac33bfc7bec7742e0694a8d6c987324d" exitCode=0 Oct 03 19:10:25 crc kubenswrapper[4835]: I1003 19:10:25.680256 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq4rb" event={"ID":"b3d0dc76-b6b4-419e-b069-32f7eacf15be","Type":"ContainerDied","Data":"23bec27044c9aa66117bd745713919ffac33bfc7bec7742e0694a8d6c987324d"} Oct 03 19:10:27 crc kubenswrapper[4835]: I1003 19:10:27.706985 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq4rb" event={"ID":"b3d0dc76-b6b4-419e-b069-32f7eacf15be","Type":"ContainerStarted","Data":"b9ffae1fc63e93fa3965e50923bf5b517286e6343aa52047e275c12de4325f4a"} Oct 03 19:10:27 crc kubenswrapper[4835]: I1003 19:10:27.826918 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:27 crc kubenswrapper[4835]: I1003 19:10:27.884228 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:28 crc kubenswrapper[4835]: I1003 19:10:28.718271 4835 generic.go:334] "Generic (PLEG): container finished" podID="b3d0dc76-b6b4-419e-b069-32f7eacf15be" containerID="b9ffae1fc63e93fa3965e50923bf5b517286e6343aa52047e275c12de4325f4a" exitCode=0 Oct 03 19:10:28 crc kubenswrapper[4835]: I1003 19:10:28.718380 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq4rb" event={"ID":"b3d0dc76-b6b4-419e-b069-32f7eacf15be","Type":"ContainerDied","Data":"b9ffae1fc63e93fa3965e50923bf5b517286e6343aa52047e275c12de4325f4a"} Oct 03 19:10:30 crc kubenswrapper[4835]: I1003 19:10:30.164012 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nftx4"] Oct 03 19:10:30 crc kubenswrapper[4835]: I1003 19:10:30.165439 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nftx4" podUID="b2034453-d610-46b0-acf2-399ca863c110" containerName="registry-server" containerID="cri-o://49b1c8171ea8cd57e9e5f70a373a10ecc6681c2fb3eafd6d314a490e0885b708" gracePeriod=2 Oct 03 19:10:32 crc kubenswrapper[4835]: I1003 19:10:32.431785 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-nlppw" podUID="dfa5afca-b50e-4111-8e18-a14835de06f9" 
containerName="registry-server" probeResult="failure" output=< Oct 03 19:10:32 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Oct 03 19:10:32 crc kubenswrapper[4835]: > Oct 03 19:10:35 crc kubenswrapper[4835]: I1003 19:10:35.358189 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:10:35 crc kubenswrapper[4835]: I1003 19:10:35.358720 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:10:36 crc kubenswrapper[4835]: I1003 19:10:36.885993 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:36 crc kubenswrapper[4835]: I1003 19:10:36.956165 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2034453-d610-46b0-acf2-399ca863c110-utilities\") pod \"b2034453-d610-46b0-acf2-399ca863c110\" (UID: \"b2034453-d610-46b0-acf2-399ca863c110\") " Oct 03 19:10:36 crc kubenswrapper[4835]: I1003 19:10:36.956265 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2034453-d610-46b0-acf2-399ca863c110-catalog-content\") pod \"b2034453-d610-46b0-acf2-399ca863c110\" (UID: \"b2034453-d610-46b0-acf2-399ca863c110\") " Oct 03 19:10:36 crc kubenswrapper[4835]: I1003 19:10:36.956334 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccrsj\" (UniqueName: \"kubernetes.io/projected/b2034453-d610-46b0-acf2-399ca863c110-kube-api-access-ccrsj\") pod \"b2034453-d610-46b0-acf2-399ca863c110\" (UID: \"b2034453-d610-46b0-acf2-399ca863c110\") " Oct 03 19:10:36 crc kubenswrapper[4835]: I1003 19:10:36.957367 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2034453-d610-46b0-acf2-399ca863c110-utilities" (OuterVolumeSpecName: "utilities") pod "b2034453-d610-46b0-acf2-399ca863c110" (UID: "b2034453-d610-46b0-acf2-399ca863c110"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:10:36 crc kubenswrapper[4835]: I1003 19:10:36.987687 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2034453-d610-46b0-acf2-399ca863c110-kube-api-access-ccrsj" (OuterVolumeSpecName: "kube-api-access-ccrsj") pod "b2034453-d610-46b0-acf2-399ca863c110" (UID: "b2034453-d610-46b0-acf2-399ca863c110"). InnerVolumeSpecName "kube-api-access-ccrsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:10:37 crc kubenswrapper[4835]: I1003 19:10:37.000193 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2034453-d610-46b0-acf2-399ca863c110-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2034453-d610-46b0-acf2-399ca863c110" (UID: "b2034453-d610-46b0-acf2-399ca863c110"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:10:37 crc kubenswrapper[4835]: I1003 19:10:37.059540 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2034453-d610-46b0-acf2-399ca863c110-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:10:37 crc kubenswrapper[4835]: I1003 19:10:37.059586 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2034453-d610-46b0-acf2-399ca863c110-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:10:37 crc kubenswrapper[4835]: I1003 19:10:37.059600 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccrsj\" (UniqueName: \"kubernetes.io/projected/b2034453-d610-46b0-acf2-399ca863c110-kube-api-access-ccrsj\") on node \"crc\" DevicePath \"\"" Oct 03 19:10:37 crc kubenswrapper[4835]: I1003 19:10:37.474630 4835 generic.go:334] "Generic (PLEG): container finished" podID="b2034453-d610-46b0-acf2-399ca863c110" containerID="49b1c8171ea8cd57e9e5f70a373a10ecc6681c2fb3eafd6d314a490e0885b708" exitCode=0 Oct 03 19:10:37 crc kubenswrapper[4835]: I1003 19:10:37.474715 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nftx4" event={"ID":"b2034453-d610-46b0-acf2-399ca863c110","Type":"ContainerDied","Data":"49b1c8171ea8cd57e9e5f70a373a10ecc6681c2fb3eafd6d314a490e0885b708"} Oct 03 19:10:37 crc kubenswrapper[4835]: I1003 19:10:37.475277 4835 scope.go:117] "RemoveContainer" containerID="49b1c8171ea8cd57e9e5f70a373a10ecc6681c2fb3eafd6d314a490e0885b708" Oct 03 19:10:37 crc kubenswrapper[4835]: I1003 19:10:37.506139 4835 scope.go:117] "RemoveContainer" containerID="ac20b5aa916a42fd6c8a856a60f79d6b416bfd417af374b137993134dd7f0d76" Oct 03 19:10:37 crc kubenswrapper[4835]: I1003 19:10:37.528487 4835 scope.go:117] "RemoveContainer" containerID="bbabceaaa961434676d1f2b6d0710e84908f28cd0aaeae406ed8889110394b90" Oct 03 19:10:38 crc kubenswrapper[4835]: I1003 19:10:38.503574 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq4rb" event={"ID":"b3d0dc76-b6b4-419e-b069-32f7eacf15be","Type":"ContainerStarted","Data":"4a787589fd7994a40c069145b0716d88108dd6801ff72089bc23cf6ccc0e466b"} Oct 03 19:10:38 crc kubenswrapper[4835]: I1003 19:10:38.508253 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nftx4" event={"ID":"b2034453-d610-46b0-acf2-399ca863c110","Type":"ContainerDied","Data":"b8ae57b62c83d005e2d2e2631223f01ce70d547658ce28fd2606bc455b134af1"} Oct 03 19:10:38 crc kubenswrapper[4835]: I1003 19:10:38.508287 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nftx4" Oct 03 19:10:38 crc kubenswrapper[4835]: I1003 19:10:38.533149 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gq4rb" podStartSLOduration=4.204417346 podStartE2EDuration="15.533022968s" podCreationTimestamp="2025-10-03 19:10:23 +0000 UTC" firstStartedPulling="2025-10-03 19:10:25.682408083 +0000 UTC m=+3367.398348955" lastFinishedPulling="2025-10-03 19:10:37.011013705 +0000 UTC m=+3378.726954577" observedRunningTime="2025-10-03 19:10:38.521237478 +0000 UTC m=+3380.237178370" watchObservedRunningTime="2025-10-03 19:10:38.533022968 +0000 UTC m=+3380.248963830" Oct 03 19:10:38 crc kubenswrapper[4835]: I1003 19:10:38.560956 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nftx4"] Oct 03 19:10:38 crc kubenswrapper[4835]: I1003 19:10:38.571516 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nftx4"] Oct 03 19:10:38 crc kubenswrapper[4835]: I1003 19:10:38.890576 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2034453-d610-46b0-acf2-399ca863c110" path="/var/lib/kubelet/pods/b2034453-d610-46b0-acf2-399ca863c110/volumes" Oct 03 19:10:42 crc kubenswrapper[4835]: I1003 19:10:42.434899 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-nlppw" podUID="dfa5afca-b50e-4111-8e18-a14835de06f9" containerName="registry-server" probeResult="failure" output=< Oct 03 19:10:42 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Oct 03 19:10:42 crc kubenswrapper[4835]: > Oct 03 19:10:43 crc kubenswrapper[4835]: I1003 19:10:43.903243 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:43 crc kubenswrapper[4835]: I1003 19:10:43.903611 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:43 crc kubenswrapper[4835]: I1003 19:10:43.957911 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:44 crc kubenswrapper[4835]: I1003 19:10:44.622539 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:44 crc kubenswrapper[4835]: I1003 19:10:44.678530 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gq4rb"] Oct 03 19:10:46 crc kubenswrapper[4835]: I1003 19:10:46.596478 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gq4rb" podUID="b3d0dc76-b6b4-419e-b069-32f7eacf15be" containerName="registry-server" containerID="cri-o://4a787589fd7994a40c069145b0716d88108dd6801ff72089bc23cf6ccc0e466b" gracePeriod=2 Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.167985 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.298248 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d0dc76-b6b4-419e-b069-32f7eacf15be-catalog-content\") pod \"b3d0dc76-b6b4-419e-b069-32f7eacf15be\" (UID: \"b3d0dc76-b6b4-419e-b069-32f7eacf15be\") " Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.298348 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2vsm\" (UniqueName: \"kubernetes.io/projected/b3d0dc76-b6b4-419e-b069-32f7eacf15be-kube-api-access-t2vsm\") pod \"b3d0dc76-b6b4-419e-b069-32f7eacf15be\" (UID: \"b3d0dc76-b6b4-419e-b069-32f7eacf15be\") " Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.298457 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d0dc76-b6b4-419e-b069-32f7eacf15be-utilities\") pod \"b3d0dc76-b6b4-419e-b069-32f7eacf15be\" (UID: \"b3d0dc76-b6b4-419e-b069-32f7eacf15be\") " Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.300026 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d0dc76-b6b4-419e-b069-32f7eacf15be-utilities" (OuterVolumeSpecName: "utilities") pod "b3d0dc76-b6b4-419e-b069-32f7eacf15be" (UID: "b3d0dc76-b6b4-419e-b069-32f7eacf15be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.307568 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d0dc76-b6b4-419e-b069-32f7eacf15be-kube-api-access-t2vsm" (OuterVolumeSpecName: "kube-api-access-t2vsm") pod "b3d0dc76-b6b4-419e-b069-32f7eacf15be" (UID: "b3d0dc76-b6b4-419e-b069-32f7eacf15be"). InnerVolumeSpecName "kube-api-access-t2vsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.401135 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d0dc76-b6b4-419e-b069-32f7eacf15be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3d0dc76-b6b4-419e-b069-32f7eacf15be" (UID: "b3d0dc76-b6b4-419e-b069-32f7eacf15be"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.401329 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2vsm\" (UniqueName: \"kubernetes.io/projected/b3d0dc76-b6b4-419e-b069-32f7eacf15be-kube-api-access-t2vsm\") on node \"crc\" DevicePath \"\"" Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.401865 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d0dc76-b6b4-419e-b069-32f7eacf15be-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.503920 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d0dc76-b6b4-419e-b069-32f7eacf15be-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.627253 4835 generic.go:334] "Generic (PLEG): container finished" podID="b3d0dc76-b6b4-419e-b069-32f7eacf15be" containerID="4a787589fd7994a40c069145b0716d88108dd6801ff72089bc23cf6ccc0e466b" exitCode=0 Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.627320 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq4rb" event={"ID":"b3d0dc76-b6b4-419e-b069-32f7eacf15be","Type":"ContainerDied","Data":"4a787589fd7994a40c069145b0716d88108dd6801ff72089bc23cf6ccc0e466b"} Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.627379 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq4rb" event={"ID":"b3d0dc76-b6b4-419e-b069-32f7eacf15be","Type":"ContainerDied","Data":"4942068aab5a35d7bd104877c4abb1288a33b20de65ea0d7cf647a04452a5482"} Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.627388 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gq4rb" Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.627398 4835 scope.go:117] "RemoveContainer" containerID="4a787589fd7994a40c069145b0716d88108dd6801ff72089bc23cf6ccc0e466b" Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.670033 4835 scope.go:117] "RemoveContainer" containerID="b9ffae1fc63e93fa3965e50923bf5b517286e6343aa52047e275c12de4325f4a" Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.681730 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gq4rb"] Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.695357 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gq4rb"] Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.720632 4835 scope.go:117] "RemoveContainer" containerID="23bec27044c9aa66117bd745713919ffac33bfc7bec7742e0694a8d6c987324d" Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.760812 4835 scope.go:117] "RemoveContainer" containerID="4a787589fd7994a40c069145b0716d88108dd6801ff72089bc23cf6ccc0e466b" Oct 03 19:10:47 crc kubenswrapper[4835]: E1003 19:10:47.761974 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a787589fd7994a40c069145b0716d88108dd6801ff72089bc23cf6ccc0e466b\": container with ID starting with 4a787589fd7994a40c069145b0716d88108dd6801ff72089bc23cf6ccc0e466b not found: ID does not exist" containerID="4a787589fd7994a40c069145b0716d88108dd6801ff72089bc23cf6ccc0e466b" Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.762019 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a787589fd7994a40c069145b0716d88108dd6801ff72089bc23cf6ccc0e466b"} err="failed to get container status \"4a787589fd7994a40c069145b0716d88108dd6801ff72089bc23cf6ccc0e466b\": rpc error: code = NotFound desc = could not find container \"4a787589fd7994a40c069145b0716d88108dd6801ff72089bc23cf6ccc0e466b\": container with ID starting with 4a787589fd7994a40c069145b0716d88108dd6801ff72089bc23cf6ccc0e466b not found: ID does not exist" Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.762048 4835 scope.go:117] "RemoveContainer" containerID="b9ffae1fc63e93fa3965e50923bf5b517286e6343aa52047e275c12de4325f4a" Oct 03 19:10:47 crc kubenswrapper[4835]: E1003 19:10:47.762466 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9ffae1fc63e93fa3965e50923bf5b517286e6343aa52047e275c12de4325f4a\": container with ID starting with b9ffae1fc63e93fa3965e50923bf5b517286e6343aa52047e275c12de4325f4a not found: ID does not exist" containerID="b9ffae1fc63e93fa3965e50923bf5b517286e6343aa52047e275c12de4325f4a" Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.762504 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ffae1fc63e93fa3965e50923bf5b517286e6343aa52047e275c12de4325f4a"} err="failed to get container status \"b9ffae1fc63e93fa3965e50923bf5b517286e6343aa52047e275c12de4325f4a\": rpc error: code = NotFound desc = could not find container \"b9ffae1fc63e93fa3965e50923bf5b517286e6343aa52047e275c12de4325f4a\": container with ID starting with b9ffae1fc63e93fa3965e50923bf5b517286e6343aa52047e275c12de4325f4a not found: ID does not exist" Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.762536 4835 scope.go:117] "RemoveContainer" 
containerID="23bec27044c9aa66117bd745713919ffac33bfc7bec7742e0694a8d6c987324d" Oct 03 19:10:47 crc kubenswrapper[4835]: E1003 19:10:47.763101 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23bec27044c9aa66117bd745713919ffac33bfc7bec7742e0694a8d6c987324d\": container with ID starting with 23bec27044c9aa66117bd745713919ffac33bfc7bec7742e0694a8d6c987324d not found: ID does not exist" containerID="23bec27044c9aa66117bd745713919ffac33bfc7bec7742e0694a8d6c987324d" Oct 03 19:10:47 crc kubenswrapper[4835]: I1003 19:10:47.763125 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23bec27044c9aa66117bd745713919ffac33bfc7bec7742e0694a8d6c987324d"} err="failed to get container status \"23bec27044c9aa66117bd745713919ffac33bfc7bec7742e0694a8d6c987324d\": rpc error: code = NotFound desc = could not find container \"23bec27044c9aa66117bd745713919ffac33bfc7bec7742e0694a8d6c987324d\": container with ID starting with 23bec27044c9aa66117bd745713919ffac33bfc7bec7742e0694a8d6c987324d not found: ID does not exist" Oct 03 19:10:48 crc kubenswrapper[4835]: I1003 19:10:48.889982 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d0dc76-b6b4-419e-b069-32f7eacf15be" path="/var/lib/kubelet/pods/b3d0dc76-b6b4-419e-b069-32f7eacf15be/volumes" Oct 03 19:10:51 crc kubenswrapper[4835]: I1003 19:10:51.434015 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:51 crc kubenswrapper[4835]: I1003 19:10:51.490608 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:51 crc kubenswrapper[4835]: I1003 19:10:51.673723 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nlppw"] Oct 03 19:10:52 crc kubenswrapper[4835]: I1003 19:10:52.680700 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nlppw" podUID="dfa5afca-b50e-4111-8e18-a14835de06f9" containerName="registry-server" containerID="cri-o://c92988bb45ba34e1593e3a684d07e75af0b5d051d1f23eec9f1d0ed067fedea9" gracePeriod=2 Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.668629 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.693884 4835 generic.go:334] "Generic (PLEG): container finished" podID="dfa5afca-b50e-4111-8e18-a14835de06f9" containerID="c92988bb45ba34e1593e3a684d07e75af0b5d051d1f23eec9f1d0ed067fedea9" exitCode=0 Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.693931 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlppw" event={"ID":"dfa5afca-b50e-4111-8e18-a14835de06f9","Type":"ContainerDied","Data":"c92988bb45ba34e1593e3a684d07e75af0b5d051d1f23eec9f1d0ed067fedea9"} Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.693958 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nlppw" event={"ID":"dfa5afca-b50e-4111-8e18-a14835de06f9","Type":"ContainerDied","Data":"bb4ad48a108221d7341ae9adb002e50c795f3aef1d3ff8a5954c417d937a0539"} Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.693975 4835 scope.go:117] "RemoveContainer" containerID="c92988bb45ba34e1593e3a684d07e75af0b5d051d1f23eec9f1d0ed067fedea9" Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.694157 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nlppw" Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.740371 4835 scope.go:117] "RemoveContainer" containerID="82fc78a97fffceaac1048c97411affade5ea277da7276e3d6e2c308ce3cb3ebd" Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.749590 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfa5afca-b50e-4111-8e18-a14835de06f9-catalog-content\") pod \"dfa5afca-b50e-4111-8e18-a14835de06f9\" (UID: \"dfa5afca-b50e-4111-8e18-a14835de06f9\") " Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.749924 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfa5afca-b50e-4111-8e18-a14835de06f9-utilities\") pod \"dfa5afca-b50e-4111-8e18-a14835de06f9\" (UID: \"dfa5afca-b50e-4111-8e18-a14835de06f9\") " Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.749982 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkkxb\" (UniqueName: \"kubernetes.io/projected/dfa5afca-b50e-4111-8e18-a14835de06f9-kube-api-access-tkkxb\") pod \"dfa5afca-b50e-4111-8e18-a14835de06f9\" (UID: \"dfa5afca-b50e-4111-8e18-a14835de06f9\") " Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.750427 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfa5afca-b50e-4111-8e18-a14835de06f9-utilities" (OuterVolumeSpecName: "utilities") pod "dfa5afca-b50e-4111-8e18-a14835de06f9" (UID: "dfa5afca-b50e-4111-8e18-a14835de06f9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.751050 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfa5afca-b50e-4111-8e18-a14835de06f9-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.758192 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfa5afca-b50e-4111-8e18-a14835de06f9-kube-api-access-tkkxb" (OuterVolumeSpecName: "kube-api-access-tkkxb") pod "dfa5afca-b50e-4111-8e18-a14835de06f9" (UID: "dfa5afca-b50e-4111-8e18-a14835de06f9"). InnerVolumeSpecName "kube-api-access-tkkxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.777618 4835 scope.go:117] "RemoveContainer" containerID="3d3d431b62038d37d69ae0cf3ebf5c21ab79b6fc2f565c48b1b53e71c165d52d" Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.798434 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfa5afca-b50e-4111-8e18-a14835de06f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dfa5afca-b50e-4111-8e18-a14835de06f9" (UID: "dfa5afca-b50e-4111-8e18-a14835de06f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.853285 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfa5afca-b50e-4111-8e18-a14835de06f9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.853324 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkkxb\" (UniqueName: \"kubernetes.io/projected/dfa5afca-b50e-4111-8e18-a14835de06f9-kube-api-access-tkkxb\") on node \"crc\" DevicePath \"\"" Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.864260 4835 scope.go:117] "RemoveContainer" containerID="c92988bb45ba34e1593e3a684d07e75af0b5d051d1f23eec9f1d0ed067fedea9" Oct 03 19:10:53 crc kubenswrapper[4835]: E1003 19:10:53.865982 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c92988bb45ba34e1593e3a684d07e75af0b5d051d1f23eec9f1d0ed067fedea9\": container with ID starting with c92988bb45ba34e1593e3a684d07e75af0b5d051d1f23eec9f1d0ed067fedea9 not found: ID does not exist" containerID="c92988bb45ba34e1593e3a684d07e75af0b5d051d1f23eec9f1d0ed067fedea9" Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.866018 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c92988bb45ba34e1593e3a684d07e75af0b5d051d1f23eec9f1d0ed067fedea9"} err="failed to get container status \"c92988bb45ba34e1593e3a684d07e75af0b5d051d1f23eec9f1d0ed067fedea9\": rpc error: code = NotFound desc = could not find container \"c92988bb45ba34e1593e3a684d07e75af0b5d051d1f23eec9f1d0ed067fedea9\": container with ID starting with c92988bb45ba34e1593e3a684d07e75af0b5d051d1f23eec9f1d0ed067fedea9 not found: ID does not exist" Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.866044 4835 scope.go:117] "RemoveContainer" containerID="82fc78a97fffceaac1048c97411affade5ea277da7276e3d6e2c308ce3cb3ebd" Oct 03 19:10:53 crc kubenswrapper[4835]: E1003 19:10:53.866517 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"82fc78a97fffceaac1048c97411affade5ea277da7276e3d6e2c308ce3cb3ebd\": container with ID starting with 82fc78a97fffceaac1048c97411affade5ea277da7276e3d6e2c308ce3cb3ebd not found: ID does not exist" containerID="82fc78a97fffceaac1048c97411affade5ea277da7276e3d6e2c308ce3cb3ebd" Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.866631 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82fc78a97fffceaac1048c97411affade5ea277da7276e3d6e2c308ce3cb3ebd"} err="failed to get container status \"82fc78a97fffceaac1048c97411affade5ea277da7276e3d6e2c308ce3cb3ebd\": rpc error: code = NotFound desc = could not find container \"82fc78a97fffceaac1048c97411affade5ea277da7276e3d6e2c308ce3cb3ebd\": container with ID starting with 82fc78a97fffceaac1048c97411affade5ea277da7276e3d6e2c308ce3cb3ebd not found: ID does not exist" Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.866726 4835 scope.go:117] "RemoveContainer" containerID="3d3d431b62038d37d69ae0cf3ebf5c21ab79b6fc2f565c48b1b53e71c165d52d" Oct 03 19:10:53 crc kubenswrapper[4835]: E1003 19:10:53.867118 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d3d431b62038d37d69ae0cf3ebf5c21ab79b6fc2f565c48b1b53e71c165d52d\": container with ID starting with 3d3d431b62038d37d69ae0cf3ebf5c21ab79b6fc2f565c48b1b53e71c165d52d not found: ID does not exist" containerID="3d3d431b62038d37d69ae0cf3ebf5c21ab79b6fc2f565c48b1b53e71c165d52d" Oct 03 19:10:53 crc kubenswrapper[4835]: I1003 19:10:53.867160 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d3d431b62038d37d69ae0cf3ebf5c21ab79b6fc2f565c48b1b53e71c165d52d"} err="failed to get container status \"3d3d431b62038d37d69ae0cf3ebf5c21ab79b6fc2f565c48b1b53e71c165d52d\": rpc error: code = NotFound desc = could not find container \"3d3d431b62038d37d69ae0cf3ebf5c21ab79b6fc2f565c48b1b53e71c165d52d\": container with ID starting with 3d3d431b62038d37d69ae0cf3ebf5c21ab79b6fc2f565c48b1b53e71c165d52d not found: ID does not exist" Oct 03 19:10:54 crc kubenswrapper[4835]: I1003 19:10:54.031343 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nlppw"] Oct 03 19:10:54 crc kubenswrapper[4835]: I1003 19:10:54.040404 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nlppw"] Oct 03 19:10:54 crc kubenswrapper[4835]: I1003 19:10:54.892183 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfa5afca-b50e-4111-8e18-a14835de06f9" path="/var/lib/kubelet/pods/dfa5afca-b50e-4111-8e18-a14835de06f9/volumes" Oct 03 19:11:05 crc kubenswrapper[4835]: I1003 19:11:05.358818 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:11:05 crc kubenswrapper[4835]: I1003 19:11:05.359698 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:11:35 crc kubenswrapper[4835]: I1003 19:11:35.359191 4835 patch_prober.go:28] 
interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:11:35 crc kubenswrapper[4835]: I1003 19:11:35.360255 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:11:35 crc kubenswrapper[4835]: I1003 19:11:35.360349 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 19:11:35 crc kubenswrapper[4835]: I1003 19:11:35.361492 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39ca16f78a66381fda435b1c9590cc632e8fcc1885b5333b2a7eeea08dbe5272"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 19:11:35 crc kubenswrapper[4835]: I1003 19:11:35.361611 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://39ca16f78a66381fda435b1c9590cc632e8fcc1885b5333b2a7eeea08dbe5272" gracePeriod=600 Oct 03 19:11:36 crc kubenswrapper[4835]: I1003 19:11:36.154943 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="39ca16f78a66381fda435b1c9590cc632e8fcc1885b5333b2a7eeea08dbe5272" exitCode=0 Oct 03 19:11:36 crc kubenswrapper[4835]: I1003 19:11:36.155022 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"39ca16f78a66381fda435b1c9590cc632e8fcc1885b5333b2a7eeea08dbe5272"} Oct 03 19:11:36 crc kubenswrapper[4835]: I1003 19:11:36.155429 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb"} Oct 03 19:11:36 crc kubenswrapper[4835]: I1003 19:11:36.155462 4835 scope.go:117] "RemoveContainer" containerID="777be69ac9d7a96a6271d2d00de7d1b6ca05e7264ee977adc2461ae72e1c054f" Oct 03 19:13:35 crc kubenswrapper[4835]: I1003 19:13:35.358311 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:13:35 crc kubenswrapper[4835]: I1003 19:13:35.359427 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 
19:14:05 crc kubenswrapper[4835]: I1003 19:14:05.358771 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:14:05 crc kubenswrapper[4835]: I1003 19:14:05.359336 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:14:35 crc kubenswrapper[4835]: I1003 19:14:35.358687 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:14:35 crc kubenswrapper[4835]: I1003 19:14:35.359308 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:14:35 crc kubenswrapper[4835]: I1003 19:14:35.359369 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 19:14:35 crc kubenswrapper[4835]: I1003 19:14:35.360344 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 19:14:35 crc kubenswrapper[4835]: I1003 19:14:35.360405 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" gracePeriod=600 Oct 03 19:14:35 crc kubenswrapper[4835]: E1003 19:14:35.495323 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:14:36 crc kubenswrapper[4835]: I1003 19:14:36.079504 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" exitCode=0 Oct 03 19:14:36 crc kubenswrapper[4835]: I1003 19:14:36.079571 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" 
event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb"} Oct 03 19:14:36 crc kubenswrapper[4835]: I1003 19:14:36.079623 4835 scope.go:117] "RemoveContainer" containerID="39ca16f78a66381fda435b1c9590cc632e8fcc1885b5333b2a7eeea08dbe5272" Oct 03 19:14:36 crc kubenswrapper[4835]: I1003 19:14:36.080492 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:14:36 crc kubenswrapper[4835]: E1003 19:14:36.080892 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:14:46 crc kubenswrapper[4835]: I1003 19:14:46.877754 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:14:46 crc kubenswrapper[4835]: E1003 19:14:46.878958 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:14:58 crc kubenswrapper[4835]: I1003 19:14:58.888738 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:14:58 crc kubenswrapper[4835]: E1003 19:14:58.890324 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.197333 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v"] Oct 03 19:15:00 crc kubenswrapper[4835]: E1003 19:15:00.197886 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d0dc76-b6b4-419e-b069-32f7eacf15be" containerName="registry-server" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.197905 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d0dc76-b6b4-419e-b069-32f7eacf15be" containerName="registry-server" Oct 03 19:15:00 crc kubenswrapper[4835]: E1003 19:15:00.197916 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2034453-d610-46b0-acf2-399ca863c110" containerName="extract-utilities" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.197924 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2034453-d610-46b0-acf2-399ca863c110" containerName="extract-utilities" Oct 03 19:15:00 crc kubenswrapper[4835]: E1003 19:15:00.197949 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d0dc76-b6b4-419e-b069-32f7eacf15be" containerName="extract-content" Oct 03 19:15:00 
crc kubenswrapper[4835]: I1003 19:15:00.197957 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d0dc76-b6b4-419e-b069-32f7eacf15be" containerName="extract-content" Oct 03 19:15:00 crc kubenswrapper[4835]: E1003 19:15:00.197977 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2034453-d610-46b0-acf2-399ca863c110" containerName="extract-content" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.197985 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2034453-d610-46b0-acf2-399ca863c110" containerName="extract-content" Oct 03 19:15:00 crc kubenswrapper[4835]: E1003 19:15:00.198007 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2034453-d610-46b0-acf2-399ca863c110" containerName="registry-server" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.198014 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2034453-d610-46b0-acf2-399ca863c110" containerName="registry-server" Oct 03 19:15:00 crc kubenswrapper[4835]: E1003 19:15:00.198039 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa5afca-b50e-4111-8e18-a14835de06f9" containerName="extract-utilities" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.198047 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa5afca-b50e-4111-8e18-a14835de06f9" containerName="extract-utilities" Oct 03 19:15:00 crc kubenswrapper[4835]: E1003 19:15:00.198092 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa5afca-b50e-4111-8e18-a14835de06f9" containerName="registry-server" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.198101 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa5afca-b50e-4111-8e18-a14835de06f9" containerName="registry-server" Oct 03 19:15:00 crc kubenswrapper[4835]: E1003 19:15:00.198116 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa5afca-b50e-4111-8e18-a14835de06f9" containerName="extract-content" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.198123 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa5afca-b50e-4111-8e18-a14835de06f9" containerName="extract-content" Oct 03 19:15:00 crc kubenswrapper[4835]: E1003 19:15:00.198136 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d0dc76-b6b4-419e-b069-32f7eacf15be" containerName="extract-utilities" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.198144 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d0dc76-b6b4-419e-b069-32f7eacf15be" containerName="extract-utilities" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.198384 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d0dc76-b6b4-419e-b069-32f7eacf15be" containerName="registry-server" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.198418 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2034453-d610-46b0-acf2-399ca863c110" containerName="registry-server" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.198431 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa5afca-b50e-4111-8e18-a14835de06f9" containerName="registry-server" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.199178 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.202061 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.213011 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.221843 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v"] Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.235714 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04d298f3-24fe-4e66-8a43-e72c3de44ddf-secret-volume\") pod \"collect-profiles-29325315-b2k8v\" (UID: \"04d298f3-24fe-4e66-8a43-e72c3de44ddf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.235901 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbrvv\" (UniqueName: \"kubernetes.io/projected/04d298f3-24fe-4e66-8a43-e72c3de44ddf-kube-api-access-xbrvv\") pod \"collect-profiles-29325315-b2k8v\" (UID: \"04d298f3-24fe-4e66-8a43-e72c3de44ddf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.235945 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04d298f3-24fe-4e66-8a43-e72c3de44ddf-config-volume\") pod \"collect-profiles-29325315-b2k8v\" (UID: \"04d298f3-24fe-4e66-8a43-e72c3de44ddf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.337828 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04d298f3-24fe-4e66-8a43-e72c3de44ddf-secret-volume\") pod \"collect-profiles-29325315-b2k8v\" (UID: \"04d298f3-24fe-4e66-8a43-e72c3de44ddf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.337954 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbrvv\" (UniqueName: \"kubernetes.io/projected/04d298f3-24fe-4e66-8a43-e72c3de44ddf-kube-api-access-xbrvv\") pod \"collect-profiles-29325315-b2k8v\" (UID: \"04d298f3-24fe-4e66-8a43-e72c3de44ddf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.337999 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04d298f3-24fe-4e66-8a43-e72c3de44ddf-config-volume\") pod \"collect-profiles-29325315-b2k8v\" (UID: \"04d298f3-24fe-4e66-8a43-e72c3de44ddf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.339085 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04d298f3-24fe-4e66-8a43-e72c3de44ddf-config-volume\") pod 
\"collect-profiles-29325315-b2k8v\" (UID: \"04d298f3-24fe-4e66-8a43-e72c3de44ddf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.347832 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04d298f3-24fe-4e66-8a43-e72c3de44ddf-secret-volume\") pod \"collect-profiles-29325315-b2k8v\" (UID: \"04d298f3-24fe-4e66-8a43-e72c3de44ddf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.360269 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbrvv\" (UniqueName: \"kubernetes.io/projected/04d298f3-24fe-4e66-8a43-e72c3de44ddf-kube-api-access-xbrvv\") pod \"collect-profiles-29325315-b2k8v\" (UID: \"04d298f3-24fe-4e66-8a43-e72c3de44ddf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" Oct 03 19:15:00 crc kubenswrapper[4835]: I1003 19:15:00.527115 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" Oct 03 19:15:01 crc kubenswrapper[4835]: I1003 19:15:01.059176 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v"] Oct 03 19:15:01 crc kubenswrapper[4835]: I1003 19:15:01.409238 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" event={"ID":"04d298f3-24fe-4e66-8a43-e72c3de44ddf","Type":"ContainerStarted","Data":"a4a90d616c828077ef79154e13972a4ffe1b858a544234b01776d2f55fa92e85"} Oct 03 19:15:01 crc kubenswrapper[4835]: I1003 19:15:01.409824 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" event={"ID":"04d298f3-24fe-4e66-8a43-e72c3de44ddf","Type":"ContainerStarted","Data":"9cf6bf745465d0d1f63e5ef56364796c2725a218257e84090fc83432943261e7"} Oct 03 19:15:01 crc kubenswrapper[4835]: I1003 19:15:01.437364 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" podStartSLOduration=1.437336083 podStartE2EDuration="1.437336083s" podCreationTimestamp="2025-10-03 19:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 19:15:01.425855708 +0000 UTC m=+3643.141796580" watchObservedRunningTime="2025-10-03 19:15:01.437336083 +0000 UTC m=+3643.153276945" Oct 03 19:15:02 crc kubenswrapper[4835]: I1003 19:15:02.423797 4835 generic.go:334] "Generic (PLEG): container finished" podID="04d298f3-24fe-4e66-8a43-e72c3de44ddf" containerID="a4a90d616c828077ef79154e13972a4ffe1b858a544234b01776d2f55fa92e85" exitCode=0 Oct 03 19:15:02 crc kubenswrapper[4835]: I1003 19:15:02.423927 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" event={"ID":"04d298f3-24fe-4e66-8a43-e72c3de44ddf","Type":"ContainerDied","Data":"a4a90d616c828077ef79154e13972a4ffe1b858a544234b01776d2f55fa92e85"} Oct 03 19:15:03 crc kubenswrapper[4835]: I1003 19:15:03.812496 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" Oct 03 19:15:03 crc kubenswrapper[4835]: I1003 19:15:03.942388 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04d298f3-24fe-4e66-8a43-e72c3de44ddf-secret-volume\") pod \"04d298f3-24fe-4e66-8a43-e72c3de44ddf\" (UID: \"04d298f3-24fe-4e66-8a43-e72c3de44ddf\") " Oct 03 19:15:03 crc kubenswrapper[4835]: I1003 19:15:03.942591 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04d298f3-24fe-4e66-8a43-e72c3de44ddf-config-volume\") pod \"04d298f3-24fe-4e66-8a43-e72c3de44ddf\" (UID: \"04d298f3-24fe-4e66-8a43-e72c3de44ddf\") " Oct 03 19:15:03 crc kubenswrapper[4835]: I1003 19:15:03.943611 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04d298f3-24fe-4e66-8a43-e72c3de44ddf-config-volume" (OuterVolumeSpecName: "config-volume") pod "04d298f3-24fe-4e66-8a43-e72c3de44ddf" (UID: "04d298f3-24fe-4e66-8a43-e72c3de44ddf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 19:15:03 crc kubenswrapper[4835]: I1003 19:15:03.943754 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbrvv\" (UniqueName: \"kubernetes.io/projected/04d298f3-24fe-4e66-8a43-e72c3de44ddf-kube-api-access-xbrvv\") pod \"04d298f3-24fe-4e66-8a43-e72c3de44ddf\" (UID: \"04d298f3-24fe-4e66-8a43-e72c3de44ddf\") " Oct 03 19:15:03 crc kubenswrapper[4835]: I1003 19:15:03.945943 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04d298f3-24fe-4e66-8a43-e72c3de44ddf-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 19:15:03 crc kubenswrapper[4835]: I1003 19:15:03.949377 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04d298f3-24fe-4e66-8a43-e72c3de44ddf-kube-api-access-xbrvv" (OuterVolumeSpecName: "kube-api-access-xbrvv") pod "04d298f3-24fe-4e66-8a43-e72c3de44ddf" (UID: "04d298f3-24fe-4e66-8a43-e72c3de44ddf"). InnerVolumeSpecName "kube-api-access-xbrvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:15:03 crc kubenswrapper[4835]: I1003 19:15:03.950621 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04d298f3-24fe-4e66-8a43-e72c3de44ddf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "04d298f3-24fe-4e66-8a43-e72c3de44ddf" (UID: "04d298f3-24fe-4e66-8a43-e72c3de44ddf"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:15:04 crc kubenswrapper[4835]: I1003 19:15:04.047816 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04d298f3-24fe-4e66-8a43-e72c3de44ddf-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 19:15:04 crc kubenswrapper[4835]: I1003 19:15:04.048117 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbrvv\" (UniqueName: \"kubernetes.io/projected/04d298f3-24fe-4e66-8a43-e72c3de44ddf-kube-api-access-xbrvv\") on node \"crc\" DevicePath \"\"" Oct 03 19:15:04 crc kubenswrapper[4835]: I1003 19:15:04.450723 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" event={"ID":"04d298f3-24fe-4e66-8a43-e72c3de44ddf","Type":"ContainerDied","Data":"9cf6bf745465d0d1f63e5ef56364796c2725a218257e84090fc83432943261e7"} Oct 03 19:15:04 crc kubenswrapper[4835]: I1003 19:15:04.450798 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cf6bf745465d0d1f63e5ef56364796c2725a218257e84090fc83432943261e7" Oct 03 19:15:04 crc kubenswrapper[4835]: I1003 19:15:04.450874 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v" Oct 03 19:15:04 crc kubenswrapper[4835]: I1003 19:15:04.539025 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr"] Oct 03 19:15:04 crc kubenswrapper[4835]: I1003 19:15:04.547733 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325270-9jcbr"] Oct 03 19:15:04 crc kubenswrapper[4835]: I1003 19:15:04.897520 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c91cd3a2-2f89-4d78-b78f-09b6c2851f18" path="/var/lib/kubelet/pods/c91cd3a2-2f89-4d78-b78f-09b6c2851f18/volumes" Oct 03 19:15:09 crc kubenswrapper[4835]: I1003 19:15:09.877869 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:15:09 crc kubenswrapper[4835]: E1003 19:15:09.879266 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:15:23 crc kubenswrapper[4835]: I1003 19:15:23.880855 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:15:23 crc kubenswrapper[4835]: E1003 19:15:23.881890 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:15:36 crc kubenswrapper[4835]: I1003 19:15:36.876597 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:15:36 
crc kubenswrapper[4835]: E1003 19:15:36.877656 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:15:43 crc kubenswrapper[4835]: I1003 19:15:43.896983 4835 scope.go:117] "RemoveContainer" containerID="7e01af0729f0f491d3deed60f3a11e956efe3ac1486aef74a6796d66736f18e0" Oct 03 19:15:49 crc kubenswrapper[4835]: I1003 19:15:49.876571 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:15:49 crc kubenswrapper[4835]: E1003 19:15:49.877279 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:16:00 crc kubenswrapper[4835]: I1003 19:16:00.877659 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:16:00 crc kubenswrapper[4835]: E1003 19:16:00.878721 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:16:14 crc kubenswrapper[4835]: I1003 19:16:14.877617 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:16:14 crc kubenswrapper[4835]: E1003 19:16:14.879943 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:16:27 crc kubenswrapper[4835]: I1003 19:16:27.878397 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:16:27 crc kubenswrapper[4835]: E1003 19:16:27.879363 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:16:38 crc kubenswrapper[4835]: I1003 19:16:38.892745 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:16:38 crc 
kubenswrapper[4835]: E1003 19:16:38.894602 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:16:49 crc kubenswrapper[4835]: I1003 19:16:49.879423 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:16:49 crc kubenswrapper[4835]: E1003 19:16:49.882172 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:17:00 crc kubenswrapper[4835]: I1003 19:17:00.876736 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:17:00 crc kubenswrapper[4835]: E1003 19:17:00.877807 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:17:15 crc kubenswrapper[4835]: I1003 19:17:15.877598 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:17:15 crc kubenswrapper[4835]: E1003 19:17:15.880221 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:17:27 crc kubenswrapper[4835]: I1003 19:17:27.878372 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:17:27 crc kubenswrapper[4835]: E1003 19:17:27.879824 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:17:40 crc kubenswrapper[4835]: I1003 19:17:40.877220 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:17:40 crc kubenswrapper[4835]: E1003 19:17:40.878427 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:17:53 crc kubenswrapper[4835]: I1003 19:17:53.877492 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:17:53 crc kubenswrapper[4835]: E1003 19:17:53.878422 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:18:04 crc kubenswrapper[4835]: I1003 19:18:04.878410 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:18:04 crc kubenswrapper[4835]: E1003 19:18:04.881045 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:18:18 crc kubenswrapper[4835]: I1003 19:18:18.885298 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:18:18 crc kubenswrapper[4835]: E1003 19:18:18.886478 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:18:29 crc kubenswrapper[4835]: I1003 19:18:29.877382 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:18:29 crc kubenswrapper[4835]: E1003 19:18:29.878390 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:18:43 crc kubenswrapper[4835]: I1003 19:18:43.877474 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:18:43 crc kubenswrapper[4835]: E1003 19:18:43.878648 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:18:56 crc kubenswrapper[4835]: I1003 19:18:56.876896 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:18:56 crc kubenswrapper[4835]: E1003 19:18:56.878014 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:19:10 crc kubenswrapper[4835]: I1003 19:19:10.876763 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:19:10 crc kubenswrapper[4835]: E1003 19:19:10.877988 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:19:23 crc kubenswrapper[4835]: I1003 19:19:23.877641 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:19:23 crc kubenswrapper[4835]: E1003 19:19:23.878754 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:19:34 crc kubenswrapper[4835]: I1003 19:19:34.878324 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:19:34 crc kubenswrapper[4835]: E1003 19:19:34.879455 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:19:48 crc kubenswrapper[4835]: I1003 19:19:48.884890 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:19:49 crc kubenswrapper[4835]: I1003 19:19:49.587625 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"7c20e7f2b0dbe3c17d0a25bc788f6b9d7f678355c89d94dd6788d1509c507bed"} Oct 03 19:20:31 crc kubenswrapper[4835]: I1003 19:20:31.591326 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cq7np"] Oct 03 19:20:31 crc kubenswrapper[4835]: E1003 19:20:31.592588 
4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d298f3-24fe-4e66-8a43-e72c3de44ddf" containerName="collect-profiles" Oct 03 19:20:31 crc kubenswrapper[4835]: I1003 19:20:31.592610 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d298f3-24fe-4e66-8a43-e72c3de44ddf" containerName="collect-profiles" Oct 03 19:20:31 crc kubenswrapper[4835]: I1003 19:20:31.592908 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d298f3-24fe-4e66-8a43-e72c3de44ddf" containerName="collect-profiles" Oct 03 19:20:31 crc kubenswrapper[4835]: I1003 19:20:31.594945 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:31 crc kubenswrapper[4835]: I1003 19:20:31.628508 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cq7np"] Oct 03 19:20:31 crc kubenswrapper[4835]: I1003 19:20:31.672610 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5f2r\" (UniqueName: \"kubernetes.io/projected/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-kube-api-access-f5f2r\") pod \"certified-operators-cq7np\" (UID: \"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d\") " pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:31 crc kubenswrapper[4835]: I1003 19:20:31.672713 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-utilities\") pod \"certified-operators-cq7np\" (UID: \"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d\") " pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:31 crc kubenswrapper[4835]: I1003 19:20:31.672770 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-catalog-content\") pod \"certified-operators-cq7np\" (UID: \"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d\") " pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:31 crc kubenswrapper[4835]: I1003 19:20:31.774347 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-utilities\") pod \"certified-operators-cq7np\" (UID: \"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d\") " pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:31 crc kubenswrapper[4835]: I1003 19:20:31.774426 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-catalog-content\") pod \"certified-operators-cq7np\" (UID: \"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d\") " pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:31 crc kubenswrapper[4835]: I1003 19:20:31.774532 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5f2r\" (UniqueName: \"kubernetes.io/projected/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-kube-api-access-f5f2r\") pod \"certified-operators-cq7np\" (UID: \"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d\") " pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:31 crc kubenswrapper[4835]: I1003 19:20:31.774809 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-utilities\") pod \"certified-operators-cq7np\" (UID: \"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d\") " pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:31 crc kubenswrapper[4835]: I1003 19:20:31.775098 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-catalog-content\") pod \"certified-operators-cq7np\" (UID: \"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d\") " pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:31 crc kubenswrapper[4835]: I1003 19:20:31.795938 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5f2r\" (UniqueName: \"kubernetes.io/projected/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-kube-api-access-f5f2r\") pod \"certified-operators-cq7np\" (UID: \"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d\") " pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:31 crc kubenswrapper[4835]: I1003 19:20:31.930424 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:32 crc kubenswrapper[4835]: I1003 19:20:32.476180 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cq7np"] Oct 03 19:20:33 crc kubenswrapper[4835]: I1003 19:20:33.062595 4835 generic.go:334] "Generic (PLEG): container finished" podID="f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d" containerID="2eb8dfa3df8c03a06ea26e5839a1be0d8c50e0c49e4e92a980ff1f990b5a3e7e" exitCode=0 Oct 03 19:20:33 crc kubenswrapper[4835]: I1003 19:20:33.062696 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cq7np" event={"ID":"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d","Type":"ContainerDied","Data":"2eb8dfa3df8c03a06ea26e5839a1be0d8c50e0c49e4e92a980ff1f990b5a3e7e"} Oct 03 19:20:33 crc kubenswrapper[4835]: I1003 19:20:33.062789 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cq7np" event={"ID":"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d","Type":"ContainerStarted","Data":"d4afb21446527fe5dd8ee296767cb5912988be0c374a749715026f5c166122c1"} Oct 03 19:20:33 crc kubenswrapper[4835]: I1003 19:20:33.064812 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 19:20:35 crc kubenswrapper[4835]: I1003 19:20:35.089854 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cq7np" event={"ID":"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d","Type":"ContainerStarted","Data":"e42d4bf1ce94a253a4916f7f95ff1354e0a9bfa1eba9a9db10665cd898182feb"} Oct 03 19:20:36 crc kubenswrapper[4835]: I1003 19:20:36.104840 4835 generic.go:334] "Generic (PLEG): container finished" podID="f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d" containerID="e42d4bf1ce94a253a4916f7f95ff1354e0a9bfa1eba9a9db10665cd898182feb" exitCode=0 Oct 03 19:20:36 crc kubenswrapper[4835]: I1003 19:20:36.105379 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cq7np" event={"ID":"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d","Type":"ContainerDied","Data":"e42d4bf1ce94a253a4916f7f95ff1354e0a9bfa1eba9a9db10665cd898182feb"} Oct 03 19:20:37 crc kubenswrapper[4835]: I1003 19:20:37.119918 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cq7np" 
event={"ID":"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d","Type":"ContainerStarted","Data":"b167ffa2699e261da9cde9d6d16d7f613a0dc46c50cb3df287abc89557e77137"} Oct 03 19:20:37 crc kubenswrapper[4835]: I1003 19:20:37.151289 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cq7np" podStartSLOduration=2.637815963 podStartE2EDuration="6.151268387s" podCreationTimestamp="2025-10-03 19:20:31 +0000 UTC" firstStartedPulling="2025-10-03 19:20:33.064565684 +0000 UTC m=+3974.780506546" lastFinishedPulling="2025-10-03 19:20:36.578018068 +0000 UTC m=+3978.293958970" observedRunningTime="2025-10-03 19:20:37.14694547 +0000 UTC m=+3978.862886342" watchObservedRunningTime="2025-10-03 19:20:37.151268387 +0000 UTC m=+3978.867209249" Oct 03 19:20:41 crc kubenswrapper[4835]: I1003 19:20:41.930754 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:41 crc kubenswrapper[4835]: I1003 19:20:41.931461 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:41 crc kubenswrapper[4835]: I1003 19:20:41.979495 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:42 crc kubenswrapper[4835]: I1003 19:20:42.219358 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:42 crc kubenswrapper[4835]: I1003 19:20:42.264148 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cq7np"] Oct 03 19:20:44 crc kubenswrapper[4835]: I1003 19:20:44.188410 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cq7np" podUID="f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d" containerName="registry-server" containerID="cri-o://b167ffa2699e261da9cde9d6d16d7f613a0dc46c50cb3df287abc89557e77137" gracePeriod=2 Oct 03 19:20:44 crc kubenswrapper[4835]: I1003 19:20:44.706274 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:44 crc kubenswrapper[4835]: I1003 19:20:44.781331 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-catalog-content\") pod \"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d\" (UID: \"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d\") " Oct 03 19:20:44 crc kubenswrapper[4835]: I1003 19:20:44.781501 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5f2r\" (UniqueName: \"kubernetes.io/projected/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-kube-api-access-f5f2r\") pod \"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d\" (UID: \"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d\") " Oct 03 19:20:44 crc kubenswrapper[4835]: I1003 19:20:44.781744 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-utilities\") pod \"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d\" (UID: \"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d\") " Oct 03 19:20:44 crc kubenswrapper[4835]: I1003 19:20:44.782881 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-utilities" (OuterVolumeSpecName: "utilities") pod "f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d" (UID: "f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:20:44 crc kubenswrapper[4835]: I1003 19:20:44.790417 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-kube-api-access-f5f2r" (OuterVolumeSpecName: "kube-api-access-f5f2r") pod "f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d" (UID: "f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d"). InnerVolumeSpecName "kube-api-access-f5f2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:20:44 crc kubenswrapper[4835]: I1003 19:20:44.827801 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d" (UID: "f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:20:44 crc kubenswrapper[4835]: I1003 19:20:44.884314 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5f2r\" (UniqueName: \"kubernetes.io/projected/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-kube-api-access-f5f2r\") on node \"crc\" DevicePath \"\"" Oct 03 19:20:44 crc kubenswrapper[4835]: I1003 19:20:44.884351 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:20:44 crc kubenswrapper[4835]: I1003 19:20:44.884362 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:20:45 crc kubenswrapper[4835]: I1003 19:20:45.201251 4835 generic.go:334] "Generic (PLEG): container finished" podID="f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d" containerID="b167ffa2699e261da9cde9d6d16d7f613a0dc46c50cb3df287abc89557e77137" exitCode=0 Oct 03 19:20:45 crc kubenswrapper[4835]: I1003 19:20:45.201326 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cq7np" event={"ID":"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d","Type":"ContainerDied","Data":"b167ffa2699e261da9cde9d6d16d7f613a0dc46c50cb3df287abc89557e77137"} Oct 03 19:20:45 crc kubenswrapper[4835]: I1003 19:20:45.201391 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cq7np" event={"ID":"f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d","Type":"ContainerDied","Data":"d4afb21446527fe5dd8ee296767cb5912988be0c374a749715026f5c166122c1"} Oct 03 19:20:45 crc kubenswrapper[4835]: I1003 19:20:45.201416 4835 scope.go:117] "RemoveContainer" containerID="b167ffa2699e261da9cde9d6d16d7f613a0dc46c50cb3df287abc89557e77137" Oct 03 19:20:45 crc kubenswrapper[4835]: I1003 19:20:45.201713 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cq7np" Oct 03 19:20:45 crc kubenswrapper[4835]: I1003 19:20:45.226013 4835 scope.go:117] "RemoveContainer" containerID="e42d4bf1ce94a253a4916f7f95ff1354e0a9bfa1eba9a9db10665cd898182feb" Oct 03 19:20:45 crc kubenswrapper[4835]: I1003 19:20:45.234149 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cq7np"] Oct 03 19:20:45 crc kubenswrapper[4835]: I1003 19:20:45.245689 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cq7np"] Oct 03 19:20:45 crc kubenswrapper[4835]: I1003 19:20:45.256675 4835 scope.go:117] "RemoveContainer" containerID="2eb8dfa3df8c03a06ea26e5839a1be0d8c50e0c49e4e92a980ff1f990b5a3e7e" Oct 03 19:20:45 crc kubenswrapper[4835]: I1003 19:20:45.304995 4835 scope.go:117] "RemoveContainer" containerID="b167ffa2699e261da9cde9d6d16d7f613a0dc46c50cb3df287abc89557e77137" Oct 03 19:20:45 crc kubenswrapper[4835]: E1003 19:20:45.305542 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b167ffa2699e261da9cde9d6d16d7f613a0dc46c50cb3df287abc89557e77137\": container with ID starting with b167ffa2699e261da9cde9d6d16d7f613a0dc46c50cb3df287abc89557e77137 not found: ID does not exist" containerID="b167ffa2699e261da9cde9d6d16d7f613a0dc46c50cb3df287abc89557e77137" Oct 03 19:20:45 crc kubenswrapper[4835]: I1003 19:20:45.305587 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b167ffa2699e261da9cde9d6d16d7f613a0dc46c50cb3df287abc89557e77137"} err="failed to get container status \"b167ffa2699e261da9cde9d6d16d7f613a0dc46c50cb3df287abc89557e77137\": rpc error: code = NotFound desc = could not find container \"b167ffa2699e261da9cde9d6d16d7f613a0dc46c50cb3df287abc89557e77137\": container with ID starting with b167ffa2699e261da9cde9d6d16d7f613a0dc46c50cb3df287abc89557e77137 not found: ID does not exist" Oct 03 19:20:45 crc kubenswrapper[4835]: I1003 19:20:45.305616 4835 scope.go:117] "RemoveContainer" containerID="e42d4bf1ce94a253a4916f7f95ff1354e0a9bfa1eba9a9db10665cd898182feb" Oct 03 19:20:45 crc kubenswrapper[4835]: E1003 19:20:45.305968 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e42d4bf1ce94a253a4916f7f95ff1354e0a9bfa1eba9a9db10665cd898182feb\": container with ID starting with e42d4bf1ce94a253a4916f7f95ff1354e0a9bfa1eba9a9db10665cd898182feb not found: ID does not exist" containerID="e42d4bf1ce94a253a4916f7f95ff1354e0a9bfa1eba9a9db10665cd898182feb" Oct 03 19:20:45 crc kubenswrapper[4835]: I1003 19:20:45.305998 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e42d4bf1ce94a253a4916f7f95ff1354e0a9bfa1eba9a9db10665cd898182feb"} err="failed to get container status \"e42d4bf1ce94a253a4916f7f95ff1354e0a9bfa1eba9a9db10665cd898182feb\": rpc error: code = NotFound desc = could not find container \"e42d4bf1ce94a253a4916f7f95ff1354e0a9bfa1eba9a9db10665cd898182feb\": container with ID starting with e42d4bf1ce94a253a4916f7f95ff1354e0a9bfa1eba9a9db10665cd898182feb not found: ID does not exist" Oct 03 19:20:45 crc kubenswrapper[4835]: I1003 19:20:45.306018 4835 scope.go:117] "RemoveContainer" containerID="2eb8dfa3df8c03a06ea26e5839a1be0d8c50e0c49e4e92a980ff1f990b5a3e7e" Oct 03 19:20:45 crc kubenswrapper[4835]: E1003 19:20:45.306320 4835 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2eb8dfa3df8c03a06ea26e5839a1be0d8c50e0c49e4e92a980ff1f990b5a3e7e\": container with ID starting with 2eb8dfa3df8c03a06ea26e5839a1be0d8c50e0c49e4e92a980ff1f990b5a3e7e not found: ID does not exist" containerID="2eb8dfa3df8c03a06ea26e5839a1be0d8c50e0c49e4e92a980ff1f990b5a3e7e" Oct 03 19:20:45 crc kubenswrapper[4835]: I1003 19:20:45.306348 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb8dfa3df8c03a06ea26e5839a1be0d8c50e0c49e4e92a980ff1f990b5a3e7e"} err="failed to get container status \"2eb8dfa3df8c03a06ea26e5839a1be0d8c50e0c49e4e92a980ff1f990b5a3e7e\": rpc error: code = NotFound desc = could not find container \"2eb8dfa3df8c03a06ea26e5839a1be0d8c50e0c49e4e92a980ff1f990b5a3e7e\": container with ID starting with 2eb8dfa3df8c03a06ea26e5839a1be0d8c50e0c49e4e92a980ff1f990b5a3e7e not found: ID does not exist" Oct 03 19:20:46 crc kubenswrapper[4835]: I1003 19:20:46.891272 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d" path="/var/lib/kubelet/pods/f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d/volumes" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.411940 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6d7hq"] Oct 03 19:21:12 crc kubenswrapper[4835]: E1003 19:21:12.413381 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d" containerName="extract-content" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.413402 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d" containerName="extract-content" Oct 03 19:21:12 crc kubenswrapper[4835]: E1003 19:21:12.413422 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d" containerName="registry-server" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.413430 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d" containerName="registry-server" Oct 03 19:21:12 crc kubenswrapper[4835]: E1003 19:21:12.413456 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d" containerName="extract-utilities" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.413465 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d" containerName="extract-utilities" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.413769 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f56df34f-2eb7-4a5f-8ae5-1ec8d2d99c2d" containerName="registry-server" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.415975 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.434285 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6d7hq"] Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.571357 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjcrv\" (UniqueName: \"kubernetes.io/projected/1ae9321c-ef12-4406-bcab-3c93114930ce-kube-api-access-mjcrv\") pod \"redhat-marketplace-6d7hq\" (UID: \"1ae9321c-ef12-4406-bcab-3c93114930ce\") " pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.571472 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ae9321c-ef12-4406-bcab-3c93114930ce-catalog-content\") pod \"redhat-marketplace-6d7hq\" (UID: \"1ae9321c-ef12-4406-bcab-3c93114930ce\") " pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.571503 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ae9321c-ef12-4406-bcab-3c93114930ce-utilities\") pod \"redhat-marketplace-6d7hq\" (UID: \"1ae9321c-ef12-4406-bcab-3c93114930ce\") " pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.611533 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8pzrl"] Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.614688 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.625958 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8pzrl"] Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.674692 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljq9d\" (UniqueName: \"kubernetes.io/projected/d98e9af1-21a3-4f0e-b643-6ad48d24051b-kube-api-access-ljq9d\") pod \"redhat-operators-8pzrl\" (UID: \"d98e9af1-21a3-4f0e-b643-6ad48d24051b\") " pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.674844 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d98e9af1-21a3-4f0e-b643-6ad48d24051b-utilities\") pod \"redhat-operators-8pzrl\" (UID: \"d98e9af1-21a3-4f0e-b643-6ad48d24051b\") " pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.674915 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ae9321c-ef12-4406-bcab-3c93114930ce-catalog-content\") pod \"redhat-marketplace-6d7hq\" (UID: \"1ae9321c-ef12-4406-bcab-3c93114930ce\") " pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.674956 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ae9321c-ef12-4406-bcab-3c93114930ce-utilities\") pod \"redhat-marketplace-6d7hq\" (UID: 
\"1ae9321c-ef12-4406-bcab-3c93114930ce\") " pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.675157 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d98e9af1-21a3-4f0e-b643-6ad48d24051b-catalog-content\") pod \"redhat-operators-8pzrl\" (UID: \"d98e9af1-21a3-4f0e-b643-6ad48d24051b\") " pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.675401 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjcrv\" (UniqueName: \"kubernetes.io/projected/1ae9321c-ef12-4406-bcab-3c93114930ce-kube-api-access-mjcrv\") pod \"redhat-marketplace-6d7hq\" (UID: \"1ae9321c-ef12-4406-bcab-3c93114930ce\") " pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.675531 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ae9321c-ef12-4406-bcab-3c93114930ce-catalog-content\") pod \"redhat-marketplace-6d7hq\" (UID: \"1ae9321c-ef12-4406-bcab-3c93114930ce\") " pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.676120 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ae9321c-ef12-4406-bcab-3c93114930ce-utilities\") pod \"redhat-marketplace-6d7hq\" (UID: \"1ae9321c-ef12-4406-bcab-3c93114930ce\") " pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.709510 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjcrv\" (UniqueName: \"kubernetes.io/projected/1ae9321c-ef12-4406-bcab-3c93114930ce-kube-api-access-mjcrv\") pod \"redhat-marketplace-6d7hq\" (UID: \"1ae9321c-ef12-4406-bcab-3c93114930ce\") " pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.742477 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.778226 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljq9d\" (UniqueName: \"kubernetes.io/projected/d98e9af1-21a3-4f0e-b643-6ad48d24051b-kube-api-access-ljq9d\") pod \"redhat-operators-8pzrl\" (UID: \"d98e9af1-21a3-4f0e-b643-6ad48d24051b\") " pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.778507 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d98e9af1-21a3-4f0e-b643-6ad48d24051b-utilities\") pod \"redhat-operators-8pzrl\" (UID: \"d98e9af1-21a3-4f0e-b643-6ad48d24051b\") " pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.778612 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d98e9af1-21a3-4f0e-b643-6ad48d24051b-catalog-content\") pod \"redhat-operators-8pzrl\" (UID: \"d98e9af1-21a3-4f0e-b643-6ad48d24051b\") " pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.779437 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d98e9af1-21a3-4f0e-b643-6ad48d24051b-catalog-content\") pod \"redhat-operators-8pzrl\" (UID: \"d98e9af1-21a3-4f0e-b643-6ad48d24051b\") " pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.779843 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d98e9af1-21a3-4f0e-b643-6ad48d24051b-utilities\") pod \"redhat-operators-8pzrl\" (UID: \"d98e9af1-21a3-4f0e-b643-6ad48d24051b\") " pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.799962 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljq9d\" (UniqueName: \"kubernetes.io/projected/d98e9af1-21a3-4f0e-b643-6ad48d24051b-kube-api-access-ljq9d\") pod \"redhat-operators-8pzrl\" (UID: \"d98e9af1-21a3-4f0e-b643-6ad48d24051b\") " pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:12 crc kubenswrapper[4835]: I1003 19:21:12.936341 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:13 crc kubenswrapper[4835]: I1003 19:21:13.332056 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6d7hq"] Oct 03 19:21:13 crc kubenswrapper[4835]: I1003 19:21:13.503532 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6d7hq" event={"ID":"1ae9321c-ef12-4406-bcab-3c93114930ce","Type":"ContainerStarted","Data":"357f04782f3ecf4d900cfd86c454d246062b7580f99249c110f960b2883499d3"} Oct 03 19:21:13 crc kubenswrapper[4835]: I1003 19:21:13.522591 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8pzrl"] Oct 03 19:21:13 crc kubenswrapper[4835]: W1003 19:21:13.523024 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd98e9af1_21a3_4f0e_b643_6ad48d24051b.slice/crio-26b749b40c9d0e0fb97ac7cc95053660c8e61f8c3a0f00a71135f739b9a85d6e WatchSource:0}: Error finding container 26b749b40c9d0e0fb97ac7cc95053660c8e61f8c3a0f00a71135f739b9a85d6e: Status 404 returned error can't find the container with id 26b749b40c9d0e0fb97ac7cc95053660c8e61f8c3a0f00a71135f739b9a85d6e Oct 03 19:21:14 crc kubenswrapper[4835]: I1003 19:21:14.516973 4835 generic.go:334] "Generic (PLEG): container finished" podID="1ae9321c-ef12-4406-bcab-3c93114930ce" containerID="b6316759f79a47d46855fbb9ee838370c66265c71b58f6e92f22e7e3053bc8be" exitCode=0 Oct 03 19:21:14 crc kubenswrapper[4835]: I1003 19:21:14.517034 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6d7hq" event={"ID":"1ae9321c-ef12-4406-bcab-3c93114930ce","Type":"ContainerDied","Data":"b6316759f79a47d46855fbb9ee838370c66265c71b58f6e92f22e7e3053bc8be"} Oct 03 19:21:14 crc kubenswrapper[4835]: I1003 19:21:14.522108 4835 generic.go:334] "Generic (PLEG): container finished" podID="d98e9af1-21a3-4f0e-b643-6ad48d24051b" containerID="7db82f937cd0aa7254c69998051950e27456f850a0511aae3c1b67a1fa2e9868" exitCode=0 Oct 03 19:21:14 crc kubenswrapper[4835]: I1003 19:21:14.522373 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pzrl" event={"ID":"d98e9af1-21a3-4f0e-b643-6ad48d24051b","Type":"ContainerDied","Data":"7db82f937cd0aa7254c69998051950e27456f850a0511aae3c1b67a1fa2e9868"} Oct 03 19:21:14 crc kubenswrapper[4835]: I1003 19:21:14.522410 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pzrl" event={"ID":"d98e9af1-21a3-4f0e-b643-6ad48d24051b","Type":"ContainerStarted","Data":"26b749b40c9d0e0fb97ac7cc95053660c8e61f8c3a0f00a71135f739b9a85d6e"} Oct 03 19:21:16 crc kubenswrapper[4835]: I1003 19:21:16.553234 4835 generic.go:334] "Generic (PLEG): container finished" podID="1ae9321c-ef12-4406-bcab-3c93114930ce" containerID="afd6e0e7b704c387faabd44a6ffd8bd4f42461020d892f750abcf65bb7841723" exitCode=0 Oct 03 19:21:16 crc kubenswrapper[4835]: I1003 19:21:16.553302 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6d7hq" event={"ID":"1ae9321c-ef12-4406-bcab-3c93114930ce","Type":"ContainerDied","Data":"afd6e0e7b704c387faabd44a6ffd8bd4f42461020d892f750abcf65bb7841723"} Oct 03 19:21:16 crc kubenswrapper[4835]: I1003 19:21:16.561972 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pzrl" 
event={"ID":"d98e9af1-21a3-4f0e-b643-6ad48d24051b","Type":"ContainerStarted","Data":"3f63a955079d504f1749200cc9fd5101ac8258a4a20c6b1d2b95e7579628230d"} Oct 03 19:21:17 crc kubenswrapper[4835]: I1003 19:21:17.578606 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6d7hq" event={"ID":"1ae9321c-ef12-4406-bcab-3c93114930ce","Type":"ContainerStarted","Data":"d643c85e616bc392d610cff574069320d600a12e0e58abb1cdaea7db778553d2"} Oct 03 19:21:17 crc kubenswrapper[4835]: I1003 19:21:17.611500 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6d7hq" podStartSLOduration=3.074352651 podStartE2EDuration="5.611465252s" podCreationTimestamp="2025-10-03 19:21:12 +0000 UTC" firstStartedPulling="2025-10-03 19:21:14.51967453 +0000 UTC m=+4016.235615422" lastFinishedPulling="2025-10-03 19:21:17.056787141 +0000 UTC m=+4018.772728023" observedRunningTime="2025-10-03 19:21:17.602929733 +0000 UTC m=+4019.318870615" watchObservedRunningTime="2025-10-03 19:21:17.611465252 +0000 UTC m=+4019.327406164" Oct 03 19:21:18 crc kubenswrapper[4835]: I1003 19:21:18.596541 4835 generic.go:334] "Generic (PLEG): container finished" podID="d98e9af1-21a3-4f0e-b643-6ad48d24051b" containerID="3f63a955079d504f1749200cc9fd5101ac8258a4a20c6b1d2b95e7579628230d" exitCode=0 Oct 03 19:21:18 crc kubenswrapper[4835]: I1003 19:21:18.596694 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pzrl" event={"ID":"d98e9af1-21a3-4f0e-b643-6ad48d24051b","Type":"ContainerDied","Data":"3f63a955079d504f1749200cc9fd5101ac8258a4a20c6b1d2b95e7579628230d"} Oct 03 19:21:19 crc kubenswrapper[4835]: I1003 19:21:19.011846 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bj2ts"] Oct 03 19:21:19 crc kubenswrapper[4835]: I1003 19:21:19.014472 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:19 crc kubenswrapper[4835]: I1003 19:21:19.036426 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bj2ts"] Oct 03 19:21:19 crc kubenswrapper[4835]: I1003 19:21:19.155164 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnnwd\" (UniqueName: \"kubernetes.io/projected/7504d158-8ab0-443e-ae46-01fb54b9f4d3-kube-api-access-jnnwd\") pod \"community-operators-bj2ts\" (UID: \"7504d158-8ab0-443e-ae46-01fb54b9f4d3\") " pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:19 crc kubenswrapper[4835]: I1003 19:21:19.155315 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7504d158-8ab0-443e-ae46-01fb54b9f4d3-catalog-content\") pod \"community-operators-bj2ts\" (UID: \"7504d158-8ab0-443e-ae46-01fb54b9f4d3\") " pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:19 crc kubenswrapper[4835]: I1003 19:21:19.155971 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7504d158-8ab0-443e-ae46-01fb54b9f4d3-utilities\") pod \"community-operators-bj2ts\" (UID: \"7504d158-8ab0-443e-ae46-01fb54b9f4d3\") " pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:19 crc kubenswrapper[4835]: I1003 19:21:19.258900 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7504d158-8ab0-443e-ae46-01fb54b9f4d3-utilities\") pod \"community-operators-bj2ts\" (UID: \"7504d158-8ab0-443e-ae46-01fb54b9f4d3\") " pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:19 crc kubenswrapper[4835]: I1003 19:21:19.259037 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnnwd\" (UniqueName: \"kubernetes.io/projected/7504d158-8ab0-443e-ae46-01fb54b9f4d3-kube-api-access-jnnwd\") pod \"community-operators-bj2ts\" (UID: \"7504d158-8ab0-443e-ae46-01fb54b9f4d3\") " pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:19 crc kubenswrapper[4835]: I1003 19:21:19.259104 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7504d158-8ab0-443e-ae46-01fb54b9f4d3-catalog-content\") pod \"community-operators-bj2ts\" (UID: \"7504d158-8ab0-443e-ae46-01fb54b9f4d3\") " pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:19 crc kubenswrapper[4835]: I1003 19:21:19.259736 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7504d158-8ab0-443e-ae46-01fb54b9f4d3-utilities\") pod \"community-operators-bj2ts\" (UID: \"7504d158-8ab0-443e-ae46-01fb54b9f4d3\") " pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:19 crc kubenswrapper[4835]: I1003 19:21:19.259883 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7504d158-8ab0-443e-ae46-01fb54b9f4d3-catalog-content\") pod \"community-operators-bj2ts\" (UID: \"7504d158-8ab0-443e-ae46-01fb54b9f4d3\") " pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:19 crc kubenswrapper[4835]: I1003 19:21:19.293099 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jnnwd\" (UniqueName: \"kubernetes.io/projected/7504d158-8ab0-443e-ae46-01fb54b9f4d3-kube-api-access-jnnwd\") pod \"community-operators-bj2ts\" (UID: \"7504d158-8ab0-443e-ae46-01fb54b9f4d3\") " pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:19 crc kubenswrapper[4835]: I1003 19:21:19.340433 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:20 crc kubenswrapper[4835]: I1003 19:21:20.048339 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bj2ts"] Oct 03 19:21:20 crc kubenswrapper[4835]: I1003 19:21:20.657277 4835 generic.go:334] "Generic (PLEG): container finished" podID="7504d158-8ab0-443e-ae46-01fb54b9f4d3" containerID="803374c526cd2c393a5fdb1d998571c9170bcce4768263cf235753fb9cdbb763" exitCode=0 Oct 03 19:21:20 crc kubenswrapper[4835]: I1003 19:21:20.657375 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj2ts" event={"ID":"7504d158-8ab0-443e-ae46-01fb54b9f4d3","Type":"ContainerDied","Data":"803374c526cd2c393a5fdb1d998571c9170bcce4768263cf235753fb9cdbb763"} Oct 03 19:21:20 crc kubenswrapper[4835]: I1003 19:21:20.657898 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj2ts" event={"ID":"7504d158-8ab0-443e-ae46-01fb54b9f4d3","Type":"ContainerStarted","Data":"d577a49c2f6f5169eafdd4681b38898c3521989318ef0413cd50c107adcf859d"} Oct 03 19:21:20 crc kubenswrapper[4835]: I1003 19:21:20.662847 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pzrl" event={"ID":"d98e9af1-21a3-4f0e-b643-6ad48d24051b","Type":"ContainerStarted","Data":"6ca9d8ef07bd0d129039bda7d7208f9fc3c2510332994b62b06c7da9b1f150b6"} Oct 03 19:21:20 crc kubenswrapper[4835]: I1003 19:21:20.712642 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8pzrl" podStartSLOduration=3.789076686 podStartE2EDuration="8.712607814s" podCreationTimestamp="2025-10-03 19:21:12 +0000 UTC" firstStartedPulling="2025-10-03 19:21:14.524881698 +0000 UTC m=+4016.240822580" lastFinishedPulling="2025-10-03 19:21:19.448412836 +0000 UTC m=+4021.164353708" observedRunningTime="2025-10-03 19:21:20.712089072 +0000 UTC m=+4022.428029954" watchObservedRunningTime="2025-10-03 19:21:20.712607814 +0000 UTC m=+4022.428548686" Oct 03 19:21:22 crc kubenswrapper[4835]: I1003 19:21:22.687246 4835 generic.go:334] "Generic (PLEG): container finished" podID="7504d158-8ab0-443e-ae46-01fb54b9f4d3" containerID="78817c63a9dc593badeb68801a27abec9ff23a1c408ed98581300e6daac03f08" exitCode=0 Oct 03 19:21:22 crc kubenswrapper[4835]: I1003 19:21:22.687340 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj2ts" event={"ID":"7504d158-8ab0-443e-ae46-01fb54b9f4d3","Type":"ContainerDied","Data":"78817c63a9dc593badeb68801a27abec9ff23a1c408ed98581300e6daac03f08"} Oct 03 19:21:22 crc kubenswrapper[4835]: I1003 19:21:22.743115 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:22 crc kubenswrapper[4835]: I1003 19:21:22.743173 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:22 crc kubenswrapper[4835]: I1003 19:21:22.802549 4835 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:22 crc kubenswrapper[4835]: I1003 19:21:22.935960 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:22 crc kubenswrapper[4835]: I1003 19:21:22.938904 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:23 crc kubenswrapper[4835]: I1003 19:21:23.701515 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj2ts" event={"ID":"7504d158-8ab0-443e-ae46-01fb54b9f4d3","Type":"ContainerStarted","Data":"aa794c6a10887566928b49fac80d7108ac3629ad4193198d86c71b5b3068155d"} Oct 03 19:21:23 crc kubenswrapper[4835]: I1003 19:21:23.723814 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bj2ts" podStartSLOduration=3.170946599 podStartE2EDuration="5.723793775s" podCreationTimestamp="2025-10-03 19:21:18 +0000 UTC" firstStartedPulling="2025-10-03 19:21:20.659734705 +0000 UTC m=+4022.375675577" lastFinishedPulling="2025-10-03 19:21:23.212581881 +0000 UTC m=+4024.928522753" observedRunningTime="2025-10-03 19:21:23.72074494 +0000 UTC m=+4025.436685812" watchObservedRunningTime="2025-10-03 19:21:23.723793775 +0000 UTC m=+4025.439734647" Oct 03 19:21:23 crc kubenswrapper[4835]: I1003 19:21:23.760679 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:23 crc kubenswrapper[4835]: I1003 19:21:23.985951 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8pzrl" podUID="d98e9af1-21a3-4f0e-b643-6ad48d24051b" containerName="registry-server" probeResult="failure" output=< Oct 03 19:21:23 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Oct 03 19:21:23 crc kubenswrapper[4835]: > Oct 03 19:21:27 crc kubenswrapper[4835]: I1003 19:21:27.599875 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6d7hq"] Oct 03 19:21:27 crc kubenswrapper[4835]: I1003 19:21:27.600794 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6d7hq" podUID="1ae9321c-ef12-4406-bcab-3c93114930ce" containerName="registry-server" containerID="cri-o://d643c85e616bc392d610cff574069320d600a12e0e58abb1cdaea7db778553d2" gracePeriod=2 Oct 03 19:21:27 crc kubenswrapper[4835]: I1003 19:21:27.743941 4835 generic.go:334] "Generic (PLEG): container finished" podID="1ae9321c-ef12-4406-bcab-3c93114930ce" containerID="d643c85e616bc392d610cff574069320d600a12e0e58abb1cdaea7db778553d2" exitCode=0 Oct 03 19:21:27 crc kubenswrapper[4835]: I1003 19:21:27.743991 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6d7hq" event={"ID":"1ae9321c-ef12-4406-bcab-3c93114930ce","Type":"ContainerDied","Data":"d643c85e616bc392d610cff574069320d600a12e0e58abb1cdaea7db778553d2"} Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.104921 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.276671 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ae9321c-ef12-4406-bcab-3c93114930ce-catalog-content\") pod \"1ae9321c-ef12-4406-bcab-3c93114930ce\" (UID: \"1ae9321c-ef12-4406-bcab-3c93114930ce\") " Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.276808 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjcrv\" (UniqueName: \"kubernetes.io/projected/1ae9321c-ef12-4406-bcab-3c93114930ce-kube-api-access-mjcrv\") pod \"1ae9321c-ef12-4406-bcab-3c93114930ce\" (UID: \"1ae9321c-ef12-4406-bcab-3c93114930ce\") " Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.277207 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ae9321c-ef12-4406-bcab-3c93114930ce-utilities\") pod \"1ae9321c-ef12-4406-bcab-3c93114930ce\" (UID: \"1ae9321c-ef12-4406-bcab-3c93114930ce\") " Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.277936 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ae9321c-ef12-4406-bcab-3c93114930ce-utilities" (OuterVolumeSpecName: "utilities") pod "1ae9321c-ef12-4406-bcab-3c93114930ce" (UID: "1ae9321c-ef12-4406-bcab-3c93114930ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.283561 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ae9321c-ef12-4406-bcab-3c93114930ce-kube-api-access-mjcrv" (OuterVolumeSpecName: "kube-api-access-mjcrv") pod "1ae9321c-ef12-4406-bcab-3c93114930ce" (UID: "1ae9321c-ef12-4406-bcab-3c93114930ce"). InnerVolumeSpecName "kube-api-access-mjcrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.292364 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ae9321c-ef12-4406-bcab-3c93114930ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ae9321c-ef12-4406-bcab-3c93114930ce" (UID: "1ae9321c-ef12-4406-bcab-3c93114930ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.379959 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ae9321c-ef12-4406-bcab-3c93114930ce-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.380010 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ae9321c-ef12-4406-bcab-3c93114930ce-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.380025 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjcrv\" (UniqueName: \"kubernetes.io/projected/1ae9321c-ef12-4406-bcab-3c93114930ce-kube-api-access-mjcrv\") on node \"crc\" DevicePath \"\"" Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.757547 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6d7hq" event={"ID":"1ae9321c-ef12-4406-bcab-3c93114930ce","Type":"ContainerDied","Data":"357f04782f3ecf4d900cfd86c454d246062b7580f99249c110f960b2883499d3"} Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.757605 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6d7hq" Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.757623 4835 scope.go:117] "RemoveContainer" containerID="d643c85e616bc392d610cff574069320d600a12e0e58abb1cdaea7db778553d2" Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.787284 4835 scope.go:117] "RemoveContainer" containerID="afd6e0e7b704c387faabd44a6ffd8bd4f42461020d892f750abcf65bb7841723" Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.801370 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6d7hq"] Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.821281 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6d7hq"] Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.831389 4835 scope.go:117] "RemoveContainer" containerID="b6316759f79a47d46855fbb9ee838370c66265c71b58f6e92f22e7e3053bc8be" Oct 03 19:21:28 crc kubenswrapper[4835]: I1003 19:21:28.891569 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ae9321c-ef12-4406-bcab-3c93114930ce" path="/var/lib/kubelet/pods/1ae9321c-ef12-4406-bcab-3c93114930ce/volumes" Oct 03 19:21:29 crc kubenswrapper[4835]: I1003 19:21:29.340556 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:29 crc kubenswrapper[4835]: I1003 19:21:29.341042 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:29 crc kubenswrapper[4835]: I1003 19:21:29.448063 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:30 crc kubenswrapper[4835]: I1003 19:21:30.175129 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:30 crc kubenswrapper[4835]: I1003 19:21:30.806568 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bj2ts"] Oct 03 19:21:31 crc kubenswrapper[4835]: I1003 19:21:31.795092 4835 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/community-operators-bj2ts" podUID="7504d158-8ab0-443e-ae46-01fb54b9f4d3" containerName="registry-server" containerID="cri-o://aa794c6a10887566928b49fac80d7108ac3629ad4193198d86c71b5b3068155d" gracePeriod=2 Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.274555 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.394992 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7504d158-8ab0-443e-ae46-01fb54b9f4d3-utilities\") pod \"7504d158-8ab0-443e-ae46-01fb54b9f4d3\" (UID: \"7504d158-8ab0-443e-ae46-01fb54b9f4d3\") " Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.395248 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7504d158-8ab0-443e-ae46-01fb54b9f4d3-catalog-content\") pod \"7504d158-8ab0-443e-ae46-01fb54b9f4d3\" (UID: \"7504d158-8ab0-443e-ae46-01fb54b9f4d3\") " Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.395321 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnnwd\" (UniqueName: \"kubernetes.io/projected/7504d158-8ab0-443e-ae46-01fb54b9f4d3-kube-api-access-jnnwd\") pod \"7504d158-8ab0-443e-ae46-01fb54b9f4d3\" (UID: \"7504d158-8ab0-443e-ae46-01fb54b9f4d3\") " Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.396314 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7504d158-8ab0-443e-ae46-01fb54b9f4d3-utilities" (OuterVolumeSpecName: "utilities") pod "7504d158-8ab0-443e-ae46-01fb54b9f4d3" (UID: "7504d158-8ab0-443e-ae46-01fb54b9f4d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.410178 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7504d158-8ab0-443e-ae46-01fb54b9f4d3-kube-api-access-jnnwd" (OuterVolumeSpecName: "kube-api-access-jnnwd") pod "7504d158-8ab0-443e-ae46-01fb54b9f4d3" (UID: "7504d158-8ab0-443e-ae46-01fb54b9f4d3"). InnerVolumeSpecName "kube-api-access-jnnwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.453785 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7504d158-8ab0-443e-ae46-01fb54b9f4d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7504d158-8ab0-443e-ae46-01fb54b9f4d3" (UID: "7504d158-8ab0-443e-ae46-01fb54b9f4d3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.497633 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7504d158-8ab0-443e-ae46-01fb54b9f4d3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.497677 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnnwd\" (UniqueName: \"kubernetes.io/projected/7504d158-8ab0-443e-ae46-01fb54b9f4d3-kube-api-access-jnnwd\") on node \"crc\" DevicePath \"\"" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.497688 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7504d158-8ab0-443e-ae46-01fb54b9f4d3-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.805869 4835 generic.go:334] "Generic (PLEG): container finished" podID="7504d158-8ab0-443e-ae46-01fb54b9f4d3" containerID="aa794c6a10887566928b49fac80d7108ac3629ad4193198d86c71b5b3068155d" exitCode=0 Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.805932 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bj2ts" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.805951 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj2ts" event={"ID":"7504d158-8ab0-443e-ae46-01fb54b9f4d3","Type":"ContainerDied","Data":"aa794c6a10887566928b49fac80d7108ac3629ad4193198d86c71b5b3068155d"} Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.807026 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj2ts" event={"ID":"7504d158-8ab0-443e-ae46-01fb54b9f4d3","Type":"ContainerDied","Data":"d577a49c2f6f5169eafdd4681b38898c3521989318ef0413cd50c107adcf859d"} Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.807054 4835 scope.go:117] "RemoveContainer" containerID="aa794c6a10887566928b49fac80d7108ac3629ad4193198d86c71b5b3068155d" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.843729 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bj2ts"] Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.844621 4835 scope.go:117] "RemoveContainer" containerID="78817c63a9dc593badeb68801a27abec9ff23a1c408ed98581300e6daac03f08" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.853584 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bj2ts"] Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.870106 4835 scope.go:117] "RemoveContainer" containerID="803374c526cd2c393a5fdb1d998571c9170bcce4768263cf235753fb9cdbb763" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.891509 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7504d158-8ab0-443e-ae46-01fb54b9f4d3" path="/var/lib/kubelet/pods/7504d158-8ab0-443e-ae46-01fb54b9f4d3/volumes" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.916044 4835 scope.go:117] "RemoveContainer" containerID="aa794c6a10887566928b49fac80d7108ac3629ad4193198d86c71b5b3068155d" Oct 03 19:21:32 crc kubenswrapper[4835]: E1003 19:21:32.916960 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa794c6a10887566928b49fac80d7108ac3629ad4193198d86c71b5b3068155d\": container with ID 
starting with aa794c6a10887566928b49fac80d7108ac3629ad4193198d86c71b5b3068155d not found: ID does not exist" containerID="aa794c6a10887566928b49fac80d7108ac3629ad4193198d86c71b5b3068155d" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.917004 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa794c6a10887566928b49fac80d7108ac3629ad4193198d86c71b5b3068155d"} err="failed to get container status \"aa794c6a10887566928b49fac80d7108ac3629ad4193198d86c71b5b3068155d\": rpc error: code = NotFound desc = could not find container \"aa794c6a10887566928b49fac80d7108ac3629ad4193198d86c71b5b3068155d\": container with ID starting with aa794c6a10887566928b49fac80d7108ac3629ad4193198d86c71b5b3068155d not found: ID does not exist" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.917032 4835 scope.go:117] "RemoveContainer" containerID="78817c63a9dc593badeb68801a27abec9ff23a1c408ed98581300e6daac03f08" Oct 03 19:21:32 crc kubenswrapper[4835]: E1003 19:21:32.917517 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78817c63a9dc593badeb68801a27abec9ff23a1c408ed98581300e6daac03f08\": container with ID starting with 78817c63a9dc593badeb68801a27abec9ff23a1c408ed98581300e6daac03f08 not found: ID does not exist" containerID="78817c63a9dc593badeb68801a27abec9ff23a1c408ed98581300e6daac03f08" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.917640 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78817c63a9dc593badeb68801a27abec9ff23a1c408ed98581300e6daac03f08"} err="failed to get container status \"78817c63a9dc593badeb68801a27abec9ff23a1c408ed98581300e6daac03f08\": rpc error: code = NotFound desc = could not find container \"78817c63a9dc593badeb68801a27abec9ff23a1c408ed98581300e6daac03f08\": container with ID starting with 78817c63a9dc593badeb68801a27abec9ff23a1c408ed98581300e6daac03f08 not found: ID does not exist" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.917706 4835 scope.go:117] "RemoveContainer" containerID="803374c526cd2c393a5fdb1d998571c9170bcce4768263cf235753fb9cdbb763" Oct 03 19:21:32 crc kubenswrapper[4835]: E1003 19:21:32.918111 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"803374c526cd2c393a5fdb1d998571c9170bcce4768263cf235753fb9cdbb763\": container with ID starting with 803374c526cd2c393a5fdb1d998571c9170bcce4768263cf235753fb9cdbb763 not found: ID does not exist" containerID="803374c526cd2c393a5fdb1d998571c9170bcce4768263cf235753fb9cdbb763" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.918188 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"803374c526cd2c393a5fdb1d998571c9170bcce4768263cf235753fb9cdbb763"} err="failed to get container status \"803374c526cd2c393a5fdb1d998571c9170bcce4768263cf235753fb9cdbb763\": rpc error: code = NotFound desc = could not find container \"803374c526cd2c393a5fdb1d998571c9170bcce4768263cf235753fb9cdbb763\": container with ID starting with 803374c526cd2c393a5fdb1d998571c9170bcce4768263cf235753fb9cdbb763 not found: ID does not exist" Oct 03 19:21:32 crc kubenswrapper[4835]: I1003 19:21:32.989609 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:33 crc kubenswrapper[4835]: I1003 19:21:33.039603 4835 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:35 crc kubenswrapper[4835]: I1003 19:21:35.403231 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8pzrl"] Oct 03 19:21:35 crc kubenswrapper[4835]: I1003 19:21:35.404117 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8pzrl" podUID="d98e9af1-21a3-4f0e-b643-6ad48d24051b" containerName="registry-server" containerID="cri-o://6ca9d8ef07bd0d129039bda7d7208f9fc3c2510332994b62b06c7da9b1f150b6" gracePeriod=2 Oct 03 19:21:35 crc kubenswrapper[4835]: I1003 19:21:35.893376 4835 generic.go:334] "Generic (PLEG): container finished" podID="d98e9af1-21a3-4f0e-b643-6ad48d24051b" containerID="6ca9d8ef07bd0d129039bda7d7208f9fc3c2510332994b62b06c7da9b1f150b6" exitCode=0 Oct 03 19:21:35 crc kubenswrapper[4835]: I1003 19:21:35.893805 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pzrl" event={"ID":"d98e9af1-21a3-4f0e-b643-6ad48d24051b","Type":"ContainerDied","Data":"6ca9d8ef07bd0d129039bda7d7208f9fc3c2510332994b62b06c7da9b1f150b6"} Oct 03 19:21:35 crc kubenswrapper[4835]: I1003 19:21:35.894047 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pzrl" event={"ID":"d98e9af1-21a3-4f0e-b643-6ad48d24051b","Type":"ContainerDied","Data":"26b749b40c9d0e0fb97ac7cc95053660c8e61f8c3a0f00a71135f739b9a85d6e"} Oct 03 19:21:35 crc kubenswrapper[4835]: I1003 19:21:35.894062 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26b749b40c9d0e0fb97ac7cc95053660c8e61f8c3a0f00a71135f739b9a85d6e" Oct 03 19:21:35 crc kubenswrapper[4835]: I1003 19:21:35.942367 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:36 crc kubenswrapper[4835]: I1003 19:21:36.073587 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d98e9af1-21a3-4f0e-b643-6ad48d24051b-catalog-content\") pod \"d98e9af1-21a3-4f0e-b643-6ad48d24051b\" (UID: \"d98e9af1-21a3-4f0e-b643-6ad48d24051b\") " Oct 03 19:21:36 crc kubenswrapper[4835]: I1003 19:21:36.073899 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljq9d\" (UniqueName: \"kubernetes.io/projected/d98e9af1-21a3-4f0e-b643-6ad48d24051b-kube-api-access-ljq9d\") pod \"d98e9af1-21a3-4f0e-b643-6ad48d24051b\" (UID: \"d98e9af1-21a3-4f0e-b643-6ad48d24051b\") " Oct 03 19:21:36 crc kubenswrapper[4835]: I1003 19:21:36.074859 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d98e9af1-21a3-4f0e-b643-6ad48d24051b-utilities\") pod \"d98e9af1-21a3-4f0e-b643-6ad48d24051b\" (UID: \"d98e9af1-21a3-4f0e-b643-6ad48d24051b\") " Oct 03 19:21:36 crc kubenswrapper[4835]: I1003 19:21:36.075509 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d98e9af1-21a3-4f0e-b643-6ad48d24051b-utilities" (OuterVolumeSpecName: "utilities") pod "d98e9af1-21a3-4f0e-b643-6ad48d24051b" (UID: "d98e9af1-21a3-4f0e-b643-6ad48d24051b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:21:36 crc kubenswrapper[4835]: I1003 19:21:36.080759 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d98e9af1-21a3-4f0e-b643-6ad48d24051b-kube-api-access-ljq9d" (OuterVolumeSpecName: "kube-api-access-ljq9d") pod "d98e9af1-21a3-4f0e-b643-6ad48d24051b" (UID: "d98e9af1-21a3-4f0e-b643-6ad48d24051b"). InnerVolumeSpecName "kube-api-access-ljq9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:21:36 crc kubenswrapper[4835]: I1003 19:21:36.165589 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d98e9af1-21a3-4f0e-b643-6ad48d24051b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d98e9af1-21a3-4f0e-b643-6ad48d24051b" (UID: "d98e9af1-21a3-4f0e-b643-6ad48d24051b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:21:36 crc kubenswrapper[4835]: I1003 19:21:36.176978 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljq9d\" (UniqueName: \"kubernetes.io/projected/d98e9af1-21a3-4f0e-b643-6ad48d24051b-kube-api-access-ljq9d\") on node \"crc\" DevicePath \"\"" Oct 03 19:21:36 crc kubenswrapper[4835]: I1003 19:21:36.177014 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d98e9af1-21a3-4f0e-b643-6ad48d24051b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:21:36 crc kubenswrapper[4835]: I1003 19:21:36.177024 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d98e9af1-21a3-4f0e-b643-6ad48d24051b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:21:36 crc kubenswrapper[4835]: I1003 19:21:36.903574 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8pzrl" Oct 03 19:21:36 crc kubenswrapper[4835]: I1003 19:21:36.935502 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8pzrl"] Oct 03 19:21:36 crc kubenswrapper[4835]: I1003 19:21:36.945704 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8pzrl"] Oct 03 19:21:38 crc kubenswrapper[4835]: I1003 19:21:38.893534 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d98e9af1-21a3-4f0e-b643-6ad48d24051b" path="/var/lib/kubelet/pods/d98e9af1-21a3-4f0e-b643-6ad48d24051b/volumes" Oct 03 19:22:05 crc kubenswrapper[4835]: I1003 19:22:05.358715 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:22:05 crc kubenswrapper[4835]: I1003 19:22:05.359581 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:22:35 crc kubenswrapper[4835]: I1003 19:22:35.358764 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:22:35 crc kubenswrapper[4835]: I1003 19:22:35.359575 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:23:05 crc kubenswrapper[4835]: I1003 19:23:05.359145 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:23:05 crc kubenswrapper[4835]: I1003 19:23:05.360006 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:23:05 crc kubenswrapper[4835]: I1003 19:23:05.360185 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 19:23:05 crc kubenswrapper[4835]: I1003 19:23:05.361925 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7c20e7f2b0dbe3c17d0a25bc788f6b9d7f678355c89d94dd6788d1509c507bed"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 
19:23:05 crc kubenswrapper[4835]: I1003 19:23:05.362020 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://7c20e7f2b0dbe3c17d0a25bc788f6b9d7f678355c89d94dd6788d1509c507bed" gracePeriod=600 Oct 03 19:23:05 crc kubenswrapper[4835]: I1003 19:23:05.817238 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="7c20e7f2b0dbe3c17d0a25bc788f6b9d7f678355c89d94dd6788d1509c507bed" exitCode=0 Oct 03 19:23:05 crc kubenswrapper[4835]: I1003 19:23:05.817298 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"7c20e7f2b0dbe3c17d0a25bc788f6b9d7f678355c89d94dd6788d1509c507bed"} Oct 03 19:23:05 crc kubenswrapper[4835]: I1003 19:23:05.817666 4835 scope.go:117] "RemoveContainer" containerID="fbd7949bfba23260d4a144fc79fc5a613dc8e86024c607124a7f626cb66572eb" Oct 03 19:23:06 crc kubenswrapper[4835]: I1003 19:23:06.830953 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc"} Oct 03 19:25:35 crc kubenswrapper[4835]: I1003 19:25:35.359318 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:25:35 crc kubenswrapper[4835]: I1003 19:25:35.360249 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:26:05 crc kubenswrapper[4835]: I1003 19:26:05.358410 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:26:05 crc kubenswrapper[4835]: I1003 19:26:05.359493 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:26:20 crc kubenswrapper[4835]: I1003 19:26:20.823647 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="71cb8688-6214-4e5e-a7da-051c5939df65" containerName="galera" probeResult="failure" output="command timed out" Oct 03 19:26:20 crc kubenswrapper[4835]: I1003 19:26:20.823694 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="71cb8688-6214-4e5e-a7da-051c5939df65" containerName="galera" probeResult="failure" output="command timed out" Oct 03 19:26:35 crc 
kubenswrapper[4835]: I1003 19:26:35.358771 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:26:35 crc kubenswrapper[4835]: I1003 19:26:35.359851 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:26:35 crc kubenswrapper[4835]: I1003 19:26:35.359914 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 19:26:35 crc kubenswrapper[4835]: I1003 19:26:35.361038 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 19:26:35 crc kubenswrapper[4835]: I1003 19:26:35.361136 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" gracePeriod=600 Oct 03 19:26:35 crc kubenswrapper[4835]: E1003 19:26:35.506003 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:26:36 crc kubenswrapper[4835]: I1003 19:26:36.309281 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" exitCode=0 Oct 03 19:26:36 crc kubenswrapper[4835]: I1003 19:26:36.309341 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc"} Oct 03 19:26:36 crc kubenswrapper[4835]: I1003 19:26:36.309390 4835 scope.go:117] "RemoveContainer" containerID="7c20e7f2b0dbe3c17d0a25bc788f6b9d7f678355c89d94dd6788d1509c507bed" Oct 03 19:26:36 crc kubenswrapper[4835]: I1003 19:26:36.310667 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:26:36 crc kubenswrapper[4835]: E1003 19:26:36.311217 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:26:49 crc kubenswrapper[4835]: I1003 19:26:49.878690 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:26:49 crc kubenswrapper[4835]: E1003 19:26:49.880502 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:27:04 crc kubenswrapper[4835]: I1003 19:27:04.877750 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:27:04 crc kubenswrapper[4835]: E1003 19:27:04.879241 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:27:17 crc kubenswrapper[4835]: I1003 19:27:17.878129 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:27:17 crc kubenswrapper[4835]: E1003 19:27:17.879419 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:27:30 crc kubenswrapper[4835]: I1003 19:27:30.878591 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:27:30 crc kubenswrapper[4835]: E1003 19:27:30.879624 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:27:44 crc kubenswrapper[4835]: I1003 19:27:44.276208 4835 scope.go:117] "RemoveContainer" containerID="6ca9d8ef07bd0d129039bda7d7208f9fc3c2510332994b62b06c7da9b1f150b6" Oct 03 19:27:44 crc kubenswrapper[4835]: I1003 19:27:44.306787 4835 scope.go:117] "RemoveContainer" containerID="3f63a955079d504f1749200cc9fd5101ac8258a4a20c6b1d2b95e7579628230d" Oct 03 19:27:44 crc kubenswrapper[4835]: I1003 19:27:44.653899 4835 scope.go:117] "RemoveContainer" containerID="7db82f937cd0aa7254c69998051950e27456f850a0511aae3c1b67a1fa2e9868" Oct 03 19:27:45 crc kubenswrapper[4835]: I1003 19:27:45.886065 4835 scope.go:117] "RemoveContainer" 
containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:27:45 crc kubenswrapper[4835]: E1003 19:27:45.887688 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:27:58 crc kubenswrapper[4835]: I1003 19:27:58.885406 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:27:58 crc kubenswrapper[4835]: E1003 19:27:58.886569 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:28:09 crc kubenswrapper[4835]: I1003 19:28:09.877330 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:28:09 crc kubenswrapper[4835]: E1003 19:28:09.878525 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:28:23 crc kubenswrapper[4835]: I1003 19:28:23.878124 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:28:23 crc kubenswrapper[4835]: E1003 19:28:23.879343 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:28:36 crc kubenswrapper[4835]: I1003 19:28:36.877096 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:28:36 crc kubenswrapper[4835]: E1003 19:28:36.878188 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:28:50 crc kubenswrapper[4835]: I1003 19:28:50.877680 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:28:50 crc kubenswrapper[4835]: E1003 19:28:50.878635 4835 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:29:03 crc kubenswrapper[4835]: I1003 19:29:03.877810 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:29:03 crc kubenswrapper[4835]: E1003 19:29:03.879316 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:29:17 crc kubenswrapper[4835]: I1003 19:29:17.878212 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:29:17 crc kubenswrapper[4835]: E1003 19:29:17.880488 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:29:20 crc kubenswrapper[4835]: I1003 19:29:20.826330 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="71cb8688-6214-4e5e-a7da-051c5939df65" containerName="galera" probeResult="failure" output="command timed out" Oct 03 19:29:20 crc kubenswrapper[4835]: I1003 19:29:20.831395 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="71cb8688-6214-4e5e-a7da-051c5939df65" containerName="galera" probeResult="failure" output="command timed out" Oct 03 19:29:32 crc kubenswrapper[4835]: I1003 19:29:32.878037 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:29:32 crc kubenswrapper[4835]: E1003 19:29:32.880037 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:29:45 crc kubenswrapper[4835]: I1003 19:29:45.877600 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:29:45 crc kubenswrapper[4835]: E1003 19:29:45.879003 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:29:50 crc kubenswrapper[4835]: I1003 19:29:50.822016 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="71cb8688-6214-4e5e-a7da-051c5939df65" containerName="galera" probeResult="failure" output="command timed out" Oct 03 19:29:50 crc kubenswrapper[4835]: I1003 19:29:50.824326 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="71cb8688-6214-4e5e-a7da-051c5939df65" containerName="galera" probeResult="failure" output="command timed out" Oct 03 19:29:58 crc kubenswrapper[4835]: I1003 19:29:58.884980 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:29:58 crc kubenswrapper[4835]: E1003 19:29:58.886225 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.159942 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n"] Oct 03 19:30:00 crc kubenswrapper[4835]: E1003 19:30:00.160853 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae9321c-ef12-4406-bcab-3c93114930ce" containerName="extract-utilities" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.160869 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae9321c-ef12-4406-bcab-3c93114930ce" containerName="extract-utilities" Oct 03 19:30:00 crc kubenswrapper[4835]: E1003 19:30:00.160893 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae9321c-ef12-4406-bcab-3c93114930ce" containerName="registry-server" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.160901 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae9321c-ef12-4406-bcab-3c93114930ce" containerName="registry-server" Oct 03 19:30:00 crc kubenswrapper[4835]: E1003 19:30:00.160934 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98e9af1-21a3-4f0e-b643-6ad48d24051b" containerName="extract-content" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.160941 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98e9af1-21a3-4f0e-b643-6ad48d24051b" containerName="extract-content" Oct 03 19:30:00 crc kubenswrapper[4835]: E1003 19:30:00.160952 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae9321c-ef12-4406-bcab-3c93114930ce" containerName="extract-content" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.160958 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae9321c-ef12-4406-bcab-3c93114930ce" containerName="extract-content" Oct 03 19:30:00 crc kubenswrapper[4835]: E1003 19:30:00.160968 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7504d158-8ab0-443e-ae46-01fb54b9f4d3" containerName="registry-server" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.160974 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7504d158-8ab0-443e-ae46-01fb54b9f4d3" containerName="registry-server" Oct 03 19:30:00 crc kubenswrapper[4835]: 
E1003 19:30:00.160982 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98e9af1-21a3-4f0e-b643-6ad48d24051b" containerName="extract-utilities" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.160989 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98e9af1-21a3-4f0e-b643-6ad48d24051b" containerName="extract-utilities" Oct 03 19:30:00 crc kubenswrapper[4835]: E1003 19:30:00.160998 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7504d158-8ab0-443e-ae46-01fb54b9f4d3" containerName="extract-utilities" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.161005 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7504d158-8ab0-443e-ae46-01fb54b9f4d3" containerName="extract-utilities" Oct 03 19:30:00 crc kubenswrapper[4835]: E1003 19:30:00.161024 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7504d158-8ab0-443e-ae46-01fb54b9f4d3" containerName="extract-content" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.161030 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7504d158-8ab0-443e-ae46-01fb54b9f4d3" containerName="extract-content" Oct 03 19:30:00 crc kubenswrapper[4835]: E1003 19:30:00.161040 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98e9af1-21a3-4f0e-b643-6ad48d24051b" containerName="registry-server" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.161047 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98e9af1-21a3-4f0e-b643-6ad48d24051b" containerName="registry-server" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.161268 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98e9af1-21a3-4f0e-b643-6ad48d24051b" containerName="registry-server" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.161295 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7504d158-8ab0-443e-ae46-01fb54b9f4d3" containerName="registry-server" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.161306 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae9321c-ef12-4406-bcab-3c93114930ce" containerName="registry-server" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.162164 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.165285 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.176463 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n"] Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.194887 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.219696 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/522f2321-d321-4b1b-bcd3-f38791444afc-config-volume\") pod \"collect-profiles-29325330-dgj5n\" (UID: \"522f2321-d321-4b1b-bcd3-f38791444afc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.219789 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/522f2321-d321-4b1b-bcd3-f38791444afc-secret-volume\") pod \"collect-profiles-29325330-dgj5n\" (UID: \"522f2321-d321-4b1b-bcd3-f38791444afc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.219857 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnzks\" (UniqueName: \"kubernetes.io/projected/522f2321-d321-4b1b-bcd3-f38791444afc-kube-api-access-wnzks\") pod \"collect-profiles-29325330-dgj5n\" (UID: \"522f2321-d321-4b1b-bcd3-f38791444afc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.323006 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/522f2321-d321-4b1b-bcd3-f38791444afc-config-volume\") pod \"collect-profiles-29325330-dgj5n\" (UID: \"522f2321-d321-4b1b-bcd3-f38791444afc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.323089 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/522f2321-d321-4b1b-bcd3-f38791444afc-secret-volume\") pod \"collect-profiles-29325330-dgj5n\" (UID: \"522f2321-d321-4b1b-bcd3-f38791444afc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.323121 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnzks\" (UniqueName: \"kubernetes.io/projected/522f2321-d321-4b1b-bcd3-f38791444afc-kube-api-access-wnzks\") pod \"collect-profiles-29325330-dgj5n\" (UID: \"522f2321-d321-4b1b-bcd3-f38791444afc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.324924 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/522f2321-d321-4b1b-bcd3-f38791444afc-config-volume\") pod 
\"collect-profiles-29325330-dgj5n\" (UID: \"522f2321-d321-4b1b-bcd3-f38791444afc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.333051 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/522f2321-d321-4b1b-bcd3-f38791444afc-secret-volume\") pod \"collect-profiles-29325330-dgj5n\" (UID: \"522f2321-d321-4b1b-bcd3-f38791444afc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.348542 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnzks\" (UniqueName: \"kubernetes.io/projected/522f2321-d321-4b1b-bcd3-f38791444afc-kube-api-access-wnzks\") pod \"collect-profiles-29325330-dgj5n\" (UID: \"522f2321-d321-4b1b-bcd3-f38791444afc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n" Oct 03 19:30:00 crc kubenswrapper[4835]: I1003 19:30:00.526887 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n" Oct 03 19:30:01 crc kubenswrapper[4835]: I1003 19:30:01.037677 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n"] Oct 03 19:30:01 crc kubenswrapper[4835]: I1003 19:30:01.795309 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n" event={"ID":"522f2321-d321-4b1b-bcd3-f38791444afc","Type":"ContainerStarted","Data":"959fa5ec60934a536674fb3a1c8df5ea0095d8d7abf2a1a1b8903b6d58dcd5db"} Oct 03 19:30:02 crc kubenswrapper[4835]: I1003 19:30:02.808559 4835 generic.go:334] "Generic (PLEG): container finished" podID="522f2321-d321-4b1b-bcd3-f38791444afc" containerID="23a32b4bedfd3d7dfb2953b14cc6263e51bfcdcae9614c68adda561b371f770c" exitCode=0 Oct 03 19:30:02 crc kubenswrapper[4835]: I1003 19:30:02.808633 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n" event={"ID":"522f2321-d321-4b1b-bcd3-f38791444afc","Type":"ContainerDied","Data":"23a32b4bedfd3d7dfb2953b14cc6263e51bfcdcae9614c68adda561b371f770c"} Oct 03 19:30:04 crc kubenswrapper[4835]: I1003 19:30:04.572219 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n" Oct 03 19:30:04 crc kubenswrapper[4835]: I1003 19:30:04.742654 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/522f2321-d321-4b1b-bcd3-f38791444afc-secret-volume\") pod \"522f2321-d321-4b1b-bcd3-f38791444afc\" (UID: \"522f2321-d321-4b1b-bcd3-f38791444afc\") " Oct 03 19:30:04 crc kubenswrapper[4835]: I1003 19:30:04.742867 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnzks\" (UniqueName: \"kubernetes.io/projected/522f2321-d321-4b1b-bcd3-f38791444afc-kube-api-access-wnzks\") pod \"522f2321-d321-4b1b-bcd3-f38791444afc\" (UID: \"522f2321-d321-4b1b-bcd3-f38791444afc\") " Oct 03 19:30:04 crc kubenswrapper[4835]: I1003 19:30:04.742948 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/522f2321-d321-4b1b-bcd3-f38791444afc-config-volume\") pod \"522f2321-d321-4b1b-bcd3-f38791444afc\" (UID: \"522f2321-d321-4b1b-bcd3-f38791444afc\") " Oct 03 19:30:04 crc kubenswrapper[4835]: I1003 19:30:04.744707 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/522f2321-d321-4b1b-bcd3-f38791444afc-config-volume" (OuterVolumeSpecName: "config-volume") pod "522f2321-d321-4b1b-bcd3-f38791444afc" (UID: "522f2321-d321-4b1b-bcd3-f38791444afc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 19:30:04 crc kubenswrapper[4835]: I1003 19:30:04.745971 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/522f2321-d321-4b1b-bcd3-f38791444afc-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 19:30:04 crc kubenswrapper[4835]: I1003 19:30:04.754004 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/522f2321-d321-4b1b-bcd3-f38791444afc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "522f2321-d321-4b1b-bcd3-f38791444afc" (UID: "522f2321-d321-4b1b-bcd3-f38791444afc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:30:04 crc kubenswrapper[4835]: I1003 19:30:04.761540 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/522f2321-d321-4b1b-bcd3-f38791444afc-kube-api-access-wnzks" (OuterVolumeSpecName: "kube-api-access-wnzks") pod "522f2321-d321-4b1b-bcd3-f38791444afc" (UID: "522f2321-d321-4b1b-bcd3-f38791444afc"). InnerVolumeSpecName "kube-api-access-wnzks". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:30:04 crc kubenswrapper[4835]: I1003 19:30:04.838006 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n" event={"ID":"522f2321-d321-4b1b-bcd3-f38791444afc","Type":"ContainerDied","Data":"959fa5ec60934a536674fb3a1c8df5ea0095d8d7abf2a1a1b8903b6d58dcd5db"} Oct 03 19:30:04 crc kubenswrapper[4835]: I1003 19:30:04.838225 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="959fa5ec60934a536674fb3a1c8df5ea0095d8d7abf2a1a1b8903b6d58dcd5db" Oct 03 19:30:04 crc kubenswrapper[4835]: I1003 19:30:04.838150 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325330-dgj5n" Oct 03 19:30:04 crc kubenswrapper[4835]: I1003 19:30:04.848178 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/522f2321-d321-4b1b-bcd3-f38791444afc-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 19:30:04 crc kubenswrapper[4835]: I1003 19:30:04.848232 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnzks\" (UniqueName: \"kubernetes.io/projected/522f2321-d321-4b1b-bcd3-f38791444afc-kube-api-access-wnzks\") on node \"crc\" DevicePath \"\"" Oct 03 19:30:05 crc kubenswrapper[4835]: I1003 19:30:05.672695 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6"] Oct 03 19:30:05 crc kubenswrapper[4835]: I1003 19:30:05.681789 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325285-gwgt6"] Oct 03 19:30:06 crc kubenswrapper[4835]: I1003 19:30:06.890044 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5af9ff5a-2dca-432e-888e-8e5e11bbabff" path="/var/lib/kubelet/pods/5af9ff5a-2dca-432e-888e-8e5e11bbabff/volumes" Oct 03 19:30:10 crc kubenswrapper[4835]: I1003 19:30:10.878029 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:30:10 crc kubenswrapper[4835]: E1003 19:30:10.878847 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:30:25 crc kubenswrapper[4835]: I1003 19:30:25.878033 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:30:25 crc kubenswrapper[4835]: E1003 19:30:25.879852 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:30:37 crc kubenswrapper[4835]: I1003 19:30:37.876996 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:30:37 crc kubenswrapper[4835]: E1003 19:30:37.880118 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:30:44 crc kubenswrapper[4835]: I1003 19:30:44.907185 4835 scope.go:117] "RemoveContainer" containerID="fe4ad2d44d3c5d719d9454f465dbdd26b3155b83916f17a834f313f4aa65c79f" Oct 03 19:30:52 crc kubenswrapper[4835]: I1003 19:30:52.876983 4835 
scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:30:52 crc kubenswrapper[4835]: E1003 19:30:52.878091 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:31:05 crc kubenswrapper[4835]: I1003 19:31:05.879562 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:31:05 crc kubenswrapper[4835]: E1003 19:31:05.881420 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:31:12 crc kubenswrapper[4835]: I1003 19:31:12.082782 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-trdgw"] Oct 03 19:31:12 crc kubenswrapper[4835]: E1003 19:31:12.084502 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="522f2321-d321-4b1b-bcd3-f38791444afc" containerName="collect-profiles" Oct 03 19:31:12 crc kubenswrapper[4835]: I1003 19:31:12.084524 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="522f2321-d321-4b1b-bcd3-f38791444afc" containerName="collect-profiles" Oct 03 19:31:12 crc kubenswrapper[4835]: I1003 19:31:12.084889 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="522f2321-d321-4b1b-bcd3-f38791444afc" containerName="collect-profiles" Oct 03 19:31:12 crc kubenswrapper[4835]: I1003 19:31:12.089390 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:12 crc kubenswrapper[4835]: I1003 19:31:12.094628 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-trdgw"] Oct 03 19:31:12 crc kubenswrapper[4835]: I1003 19:31:12.161726 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngbz8\" (UniqueName: \"kubernetes.io/projected/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-kube-api-access-ngbz8\") pod \"redhat-marketplace-trdgw\" (UID: \"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb\") " pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:12 crc kubenswrapper[4835]: I1003 19:31:12.161832 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-catalog-content\") pod \"redhat-marketplace-trdgw\" (UID: \"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb\") " pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:12 crc kubenswrapper[4835]: I1003 19:31:12.162038 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-utilities\") pod \"redhat-marketplace-trdgw\" (UID: \"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb\") " pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:12 crc kubenswrapper[4835]: I1003 19:31:12.265135 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngbz8\" (UniqueName: \"kubernetes.io/projected/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-kube-api-access-ngbz8\") pod \"redhat-marketplace-trdgw\" (UID: \"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb\") " pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:12 crc kubenswrapper[4835]: I1003 19:31:12.265678 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-catalog-content\") pod \"redhat-marketplace-trdgw\" (UID: \"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb\") " pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:12 crc kubenswrapper[4835]: I1003 19:31:12.265710 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-utilities\") pod \"redhat-marketplace-trdgw\" (UID: \"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb\") " pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:12 crc kubenswrapper[4835]: I1003 19:31:12.266263 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-utilities\") pod \"redhat-marketplace-trdgw\" (UID: \"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb\") " pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:12 crc kubenswrapper[4835]: I1003 19:31:12.266464 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-catalog-content\") pod \"redhat-marketplace-trdgw\" (UID: \"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb\") " pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:12 crc kubenswrapper[4835]: I1003 19:31:12.529306 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ngbz8\" (UniqueName: \"kubernetes.io/projected/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-kube-api-access-ngbz8\") pod \"redhat-marketplace-trdgw\" (UID: \"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb\") " pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:12 crc kubenswrapper[4835]: I1003 19:31:12.724361 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:13 crc kubenswrapper[4835]: I1003 19:31:13.228617 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-trdgw"] Oct 03 19:31:13 crc kubenswrapper[4835]: I1003 19:31:13.686971 4835 generic.go:334] "Generic (PLEG): container finished" podID="0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb" containerID="859b688a70783b938821e5c6cb68b404224c089049cf27290cbe811f331dafa8" exitCode=0 Oct 03 19:31:13 crc kubenswrapper[4835]: I1003 19:31:13.687250 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trdgw" event={"ID":"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb","Type":"ContainerDied","Data":"859b688a70783b938821e5c6cb68b404224c089049cf27290cbe811f331dafa8"} Oct 03 19:31:13 crc kubenswrapper[4835]: I1003 19:31:13.687597 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trdgw" event={"ID":"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb","Type":"ContainerStarted","Data":"e65f3c3cc614f4a8c0c5f95feade5edc913ab026baaee4e3a1fce90acfac8764"} Oct 03 19:31:13 crc kubenswrapper[4835]: I1003 19:31:13.690826 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 19:31:15 crc kubenswrapper[4835]: I1003 19:31:15.750548 4835 generic.go:334] "Generic (PLEG): container finished" podID="0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb" containerID="e09c2338f0467e11eb7ce8959be5bdd4da56382b98f7e25bf49995d77213a533" exitCode=0 Oct 03 19:31:15 crc kubenswrapper[4835]: I1003 19:31:15.750662 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trdgw" event={"ID":"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb","Type":"ContainerDied","Data":"e09c2338f0467e11eb7ce8959be5bdd4da56382b98f7e25bf49995d77213a533"} Oct 03 19:31:16 crc kubenswrapper[4835]: I1003 19:31:16.765854 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trdgw" event={"ID":"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb","Type":"ContainerStarted","Data":"cf35697924979e20bc6316c48c7cc0e430bd2dcc03d49baff1cc913fc5ac6855"} Oct 03 19:31:16 crc kubenswrapper[4835]: I1003 19:31:16.796747 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-trdgw" podStartSLOduration=2.327157798 podStartE2EDuration="4.796715659s" podCreationTimestamp="2025-10-03 19:31:12 +0000 UTC" firstStartedPulling="2025-10-03 19:31:13.690292298 +0000 UTC m=+4615.406233210" lastFinishedPulling="2025-10-03 19:31:16.159850199 +0000 UTC m=+4617.875791071" observedRunningTime="2025-10-03 19:31:16.784575391 +0000 UTC m=+4618.500516283" watchObservedRunningTime="2025-10-03 19:31:16.796715659 +0000 UTC m=+4618.512656531" Oct 03 19:31:19 crc kubenswrapper[4835]: I1003 19:31:19.341716 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-29pnt"] Oct 03 19:31:19 crc kubenswrapper[4835]: I1003 19:31:19.346281 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:19 crc kubenswrapper[4835]: I1003 19:31:19.367596 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-29pnt"] Oct 03 19:31:19 crc kubenswrapper[4835]: I1003 19:31:19.454093 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2a23e4e-f43b-4990-a27c-c93d7172df27-catalog-content\") pod \"certified-operators-29pnt\" (UID: \"a2a23e4e-f43b-4990-a27c-c93d7172df27\") " pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:19 crc kubenswrapper[4835]: I1003 19:31:19.454196 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln9hb\" (UniqueName: \"kubernetes.io/projected/a2a23e4e-f43b-4990-a27c-c93d7172df27-kube-api-access-ln9hb\") pod \"certified-operators-29pnt\" (UID: \"a2a23e4e-f43b-4990-a27c-c93d7172df27\") " pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:19 crc kubenswrapper[4835]: I1003 19:31:19.454332 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2a23e4e-f43b-4990-a27c-c93d7172df27-utilities\") pod \"certified-operators-29pnt\" (UID: \"a2a23e4e-f43b-4990-a27c-c93d7172df27\") " pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:19 crc kubenswrapper[4835]: I1003 19:31:19.556530 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2a23e4e-f43b-4990-a27c-c93d7172df27-catalog-content\") pod \"certified-operators-29pnt\" (UID: \"a2a23e4e-f43b-4990-a27c-c93d7172df27\") " pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:19 crc kubenswrapper[4835]: I1003 19:31:19.556606 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln9hb\" (UniqueName: \"kubernetes.io/projected/a2a23e4e-f43b-4990-a27c-c93d7172df27-kube-api-access-ln9hb\") pod \"certified-operators-29pnt\" (UID: \"a2a23e4e-f43b-4990-a27c-c93d7172df27\") " pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:19 crc kubenswrapper[4835]: I1003 19:31:19.556682 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2a23e4e-f43b-4990-a27c-c93d7172df27-utilities\") pod \"certified-operators-29pnt\" (UID: \"a2a23e4e-f43b-4990-a27c-c93d7172df27\") " pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:19 crc kubenswrapper[4835]: I1003 19:31:19.557587 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2a23e4e-f43b-4990-a27c-c93d7172df27-utilities\") pod \"certified-operators-29pnt\" (UID: \"a2a23e4e-f43b-4990-a27c-c93d7172df27\") " pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:19 crc kubenswrapper[4835]: I1003 19:31:19.557647 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2a23e4e-f43b-4990-a27c-c93d7172df27-catalog-content\") pod \"certified-operators-29pnt\" (UID: \"a2a23e4e-f43b-4990-a27c-c93d7172df27\") " pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:19 crc kubenswrapper[4835]: I1003 19:31:19.580166 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ln9hb\" (UniqueName: \"kubernetes.io/projected/a2a23e4e-f43b-4990-a27c-c93d7172df27-kube-api-access-ln9hb\") pod \"certified-operators-29pnt\" (UID: \"a2a23e4e-f43b-4990-a27c-c93d7172df27\") " pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:19 crc kubenswrapper[4835]: I1003 19:31:19.670311 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:20 crc kubenswrapper[4835]: I1003 19:31:20.053722 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-29pnt"] Oct 03 19:31:20 crc kubenswrapper[4835]: W1003 19:31:20.055711 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2a23e4e_f43b_4990_a27c_c93d7172df27.slice/crio-4aafd1d7550566f02d240c567afaa8c61b5305eacf4d6fde5799b75f3272c693 WatchSource:0}: Error finding container 4aafd1d7550566f02d240c567afaa8c61b5305eacf4d6fde5799b75f3272c693: Status 404 returned error can't find the container with id 4aafd1d7550566f02d240c567afaa8c61b5305eacf4d6fde5799b75f3272c693 Oct 03 19:31:20 crc kubenswrapper[4835]: I1003 19:31:20.816079 4835 generic.go:334] "Generic (PLEG): container finished" podID="a2a23e4e-f43b-4990-a27c-c93d7172df27" containerID="3513aeb8d0a94d9f5761b92e1f6d256b8bb0d73ce0a614313e10332de9e9eac3" exitCode=0 Oct 03 19:31:20 crc kubenswrapper[4835]: I1003 19:31:20.816209 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29pnt" event={"ID":"a2a23e4e-f43b-4990-a27c-c93d7172df27","Type":"ContainerDied","Data":"3513aeb8d0a94d9f5761b92e1f6d256b8bb0d73ce0a614313e10332de9e9eac3"} Oct 03 19:31:20 crc kubenswrapper[4835]: I1003 19:31:20.816537 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29pnt" event={"ID":"a2a23e4e-f43b-4990-a27c-c93d7172df27","Type":"ContainerStarted","Data":"4aafd1d7550566f02d240c567afaa8c61b5305eacf4d6fde5799b75f3272c693"} Oct 03 19:31:20 crc kubenswrapper[4835]: I1003 19:31:20.880981 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:31:20 crc kubenswrapper[4835]: E1003 19:31:20.881216 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:31:21 crc kubenswrapper[4835]: I1003 19:31:21.837565 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29pnt" event={"ID":"a2a23e4e-f43b-4990-a27c-c93d7172df27","Type":"ContainerStarted","Data":"b5c1ca1dc073b741c8c9f1bc74a49db1a0e7aa1a50be36cb2de084e4a5f8a73c"} Oct 03 19:31:22 crc kubenswrapper[4835]: I1003 19:31:22.724931 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:22 crc kubenswrapper[4835]: I1003 19:31:22.725015 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:22 crc kubenswrapper[4835]: I1003 19:31:22.792837 4835 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:22 crc kubenswrapper[4835]: I1003 19:31:22.856997 4835 generic.go:334] "Generic (PLEG): container finished" podID="a2a23e4e-f43b-4990-a27c-c93d7172df27" containerID="b5c1ca1dc073b741c8c9f1bc74a49db1a0e7aa1a50be36cb2de084e4a5f8a73c" exitCode=0 Oct 03 19:31:22 crc kubenswrapper[4835]: I1003 19:31:22.857130 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29pnt" event={"ID":"a2a23e4e-f43b-4990-a27c-c93d7172df27","Type":"ContainerDied","Data":"b5c1ca1dc073b741c8c9f1bc74a49db1a0e7aa1a50be36cb2de084e4a5f8a73c"} Oct 03 19:31:22 crc kubenswrapper[4835]: I1003 19:31:22.929164 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:23 crc kubenswrapper[4835]: I1003 19:31:23.894620 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29pnt" event={"ID":"a2a23e4e-f43b-4990-a27c-c93d7172df27","Type":"ContainerStarted","Data":"3ffa0a65790a5d03f37866baff2df57b57939f91456bbf0ad5c764f0a7c5622d"} Oct 03 19:31:23 crc kubenswrapper[4835]: I1003 19:31:23.928678 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-29pnt" podStartSLOduration=2.41760193 podStartE2EDuration="4.928642409s" podCreationTimestamp="2025-10-03 19:31:19 +0000 UTC" firstStartedPulling="2025-10-03 19:31:20.818410544 +0000 UTC m=+4622.534351416" lastFinishedPulling="2025-10-03 19:31:23.329451023 +0000 UTC m=+4625.045391895" observedRunningTime="2025-10-03 19:31:23.916011118 +0000 UTC m=+4625.631951990" watchObservedRunningTime="2025-10-03 19:31:23.928642409 +0000 UTC m=+4625.644583281" Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.118324 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-trdgw"] Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.118626 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-trdgw" podUID="0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb" containerName="registry-server" containerID="cri-o://cf35697924979e20bc6316c48c7cc0e430bd2dcc03d49baff1cc913fc5ac6855" gracePeriod=2 Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.667953 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.826740 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngbz8\" (UniqueName: \"kubernetes.io/projected/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-kube-api-access-ngbz8\") pod \"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb\" (UID: \"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb\") " Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.827346 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-utilities\") pod \"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb\" (UID: \"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb\") " Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.827485 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-catalog-content\") pod \"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb\" (UID: \"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb\") " Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.828592 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-utilities" (OuterVolumeSpecName: "utilities") pod "0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb" (UID: "0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.837611 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-kube-api-access-ngbz8" (OuterVolumeSpecName: "kube-api-access-ngbz8") pod "0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb" (UID: "0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb"). InnerVolumeSpecName "kube-api-access-ngbz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.842095 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb" (UID: "0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.920770 4835 generic.go:334] "Generic (PLEG): container finished" podID="0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb" containerID="cf35697924979e20bc6316c48c7cc0e430bd2dcc03d49baff1cc913fc5ac6855" exitCode=0 Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.920831 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trdgw" event={"ID":"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb","Type":"ContainerDied","Data":"cf35697924979e20bc6316c48c7cc0e430bd2dcc03d49baff1cc913fc5ac6855"} Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.920884 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-trdgw" event={"ID":"0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb","Type":"ContainerDied","Data":"e65f3c3cc614f4a8c0c5f95feade5edc913ab026baaee4e3a1fce90acfac8764"} Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.920911 4835 scope.go:117] "RemoveContainer" containerID="cf35697924979e20bc6316c48c7cc0e430bd2dcc03d49baff1cc913fc5ac6855" Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.920933 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-trdgw" Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.930390 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngbz8\" (UniqueName: \"kubernetes.io/projected/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-kube-api-access-ngbz8\") on node \"crc\" DevicePath \"\"" Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.930443 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.930458 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.966871 4835 scope.go:117] "RemoveContainer" containerID="e09c2338f0467e11eb7ce8959be5bdd4da56382b98f7e25bf49995d77213a533" Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.972013 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-trdgw"] Oct 03 19:31:25 crc kubenswrapper[4835]: I1003 19:31:25.988194 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-trdgw"] Oct 03 19:31:26 crc kubenswrapper[4835]: I1003 19:31:26.003991 4835 scope.go:117] "RemoveContainer" containerID="859b688a70783b938821e5c6cb68b404224c089049cf27290cbe811f331dafa8" Oct 03 19:31:26 crc kubenswrapper[4835]: I1003 19:31:26.046584 4835 scope.go:117] "RemoveContainer" containerID="cf35697924979e20bc6316c48c7cc0e430bd2dcc03d49baff1cc913fc5ac6855" Oct 03 19:31:26 crc kubenswrapper[4835]: E1003 19:31:26.047273 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf35697924979e20bc6316c48c7cc0e430bd2dcc03d49baff1cc913fc5ac6855\": container with ID starting with cf35697924979e20bc6316c48c7cc0e430bd2dcc03d49baff1cc913fc5ac6855 not found: ID does not exist" containerID="cf35697924979e20bc6316c48c7cc0e430bd2dcc03d49baff1cc913fc5ac6855" Oct 03 19:31:26 crc kubenswrapper[4835]: I1003 19:31:26.047315 4835 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf35697924979e20bc6316c48c7cc0e430bd2dcc03d49baff1cc913fc5ac6855"} err="failed to get container status \"cf35697924979e20bc6316c48c7cc0e430bd2dcc03d49baff1cc913fc5ac6855\": rpc error: code = NotFound desc = could not find container \"cf35697924979e20bc6316c48c7cc0e430bd2dcc03d49baff1cc913fc5ac6855\": container with ID starting with cf35697924979e20bc6316c48c7cc0e430bd2dcc03d49baff1cc913fc5ac6855 not found: ID does not exist" Oct 03 19:31:26 crc kubenswrapper[4835]: I1003 19:31:26.047344 4835 scope.go:117] "RemoveContainer" containerID="e09c2338f0467e11eb7ce8959be5bdd4da56382b98f7e25bf49995d77213a533" Oct 03 19:31:26 crc kubenswrapper[4835]: E1003 19:31:26.053628 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e09c2338f0467e11eb7ce8959be5bdd4da56382b98f7e25bf49995d77213a533\": container with ID starting with e09c2338f0467e11eb7ce8959be5bdd4da56382b98f7e25bf49995d77213a533 not found: ID does not exist" containerID="e09c2338f0467e11eb7ce8959be5bdd4da56382b98f7e25bf49995d77213a533" Oct 03 19:31:26 crc kubenswrapper[4835]: I1003 19:31:26.053689 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e09c2338f0467e11eb7ce8959be5bdd4da56382b98f7e25bf49995d77213a533"} err="failed to get container status \"e09c2338f0467e11eb7ce8959be5bdd4da56382b98f7e25bf49995d77213a533\": rpc error: code = NotFound desc = could not find container \"e09c2338f0467e11eb7ce8959be5bdd4da56382b98f7e25bf49995d77213a533\": container with ID starting with e09c2338f0467e11eb7ce8959be5bdd4da56382b98f7e25bf49995d77213a533 not found: ID does not exist" Oct 03 19:31:26 crc kubenswrapper[4835]: I1003 19:31:26.053724 4835 scope.go:117] "RemoveContainer" containerID="859b688a70783b938821e5c6cb68b404224c089049cf27290cbe811f331dafa8" Oct 03 19:31:26 crc kubenswrapper[4835]: E1003 19:31:26.054035 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"859b688a70783b938821e5c6cb68b404224c089049cf27290cbe811f331dafa8\": container with ID starting with 859b688a70783b938821e5c6cb68b404224c089049cf27290cbe811f331dafa8 not found: ID does not exist" containerID="859b688a70783b938821e5c6cb68b404224c089049cf27290cbe811f331dafa8" Oct 03 19:31:26 crc kubenswrapper[4835]: I1003 19:31:26.054084 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859b688a70783b938821e5c6cb68b404224c089049cf27290cbe811f331dafa8"} err="failed to get container status \"859b688a70783b938821e5c6cb68b404224c089049cf27290cbe811f331dafa8\": rpc error: code = NotFound desc = could not find container \"859b688a70783b938821e5c6cb68b404224c089049cf27290cbe811f331dafa8\": container with ID starting with 859b688a70783b938821e5c6cb68b404224c089049cf27290cbe811f331dafa8 not found: ID does not exist" Oct 03 19:31:26 crc kubenswrapper[4835]: I1003 19:31:26.892101 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb" path="/var/lib/kubelet/pods/0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb/volumes" Oct 03 19:31:29 crc kubenswrapper[4835]: I1003 19:31:29.671486 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:29 crc kubenswrapper[4835]: I1003 19:31:29.672470 4835 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:29 crc kubenswrapper[4835]: I1003 19:31:29.724688 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:30 crc kubenswrapper[4835]: I1003 19:31:30.047617 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:30 crc kubenswrapper[4835]: I1003 19:31:30.123984 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-29pnt"] Oct 03 19:31:32 crc kubenswrapper[4835]: I1003 19:31:32.014914 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-29pnt" podUID="a2a23e4e-f43b-4990-a27c-c93d7172df27" containerName="registry-server" containerID="cri-o://3ffa0a65790a5d03f37866baff2df57b57939f91456bbf0ad5c764f0a7c5622d" gracePeriod=2 Oct 03 19:31:32 crc kubenswrapper[4835]: I1003 19:31:32.522768 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:32 crc kubenswrapper[4835]: I1003 19:31:32.594527 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2a23e4e-f43b-4990-a27c-c93d7172df27-catalog-content\") pod \"a2a23e4e-f43b-4990-a27c-c93d7172df27\" (UID: \"a2a23e4e-f43b-4990-a27c-c93d7172df27\") " Oct 03 19:31:32 crc kubenswrapper[4835]: I1003 19:31:32.594601 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2a23e4e-f43b-4990-a27c-c93d7172df27-utilities\") pod \"a2a23e4e-f43b-4990-a27c-c93d7172df27\" (UID: \"a2a23e4e-f43b-4990-a27c-c93d7172df27\") " Oct 03 19:31:32 crc kubenswrapper[4835]: I1003 19:31:32.594724 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln9hb\" (UniqueName: \"kubernetes.io/projected/a2a23e4e-f43b-4990-a27c-c93d7172df27-kube-api-access-ln9hb\") pod \"a2a23e4e-f43b-4990-a27c-c93d7172df27\" (UID: \"a2a23e4e-f43b-4990-a27c-c93d7172df27\") " Oct 03 19:31:32 crc kubenswrapper[4835]: I1003 19:31:32.597475 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2a23e4e-f43b-4990-a27c-c93d7172df27-utilities" (OuterVolumeSpecName: "utilities") pod "a2a23e4e-f43b-4990-a27c-c93d7172df27" (UID: "a2a23e4e-f43b-4990-a27c-c93d7172df27"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:31:32 crc kubenswrapper[4835]: I1003 19:31:32.607805 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2a23e4e-f43b-4990-a27c-c93d7172df27-kube-api-access-ln9hb" (OuterVolumeSpecName: "kube-api-access-ln9hb") pod "a2a23e4e-f43b-4990-a27c-c93d7172df27" (UID: "a2a23e4e-f43b-4990-a27c-c93d7172df27"). InnerVolumeSpecName "kube-api-access-ln9hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:31:32 crc kubenswrapper[4835]: I1003 19:31:32.665858 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2a23e4e-f43b-4990-a27c-c93d7172df27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2a23e4e-f43b-4990-a27c-c93d7172df27" (UID: "a2a23e4e-f43b-4990-a27c-c93d7172df27"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:31:32 crc kubenswrapper[4835]: I1003 19:31:32.698552 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2a23e4e-f43b-4990-a27c-c93d7172df27-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:31:32 crc kubenswrapper[4835]: I1003 19:31:32.698605 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2a23e4e-f43b-4990-a27c-c93d7172df27-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:31:32 crc kubenswrapper[4835]: I1003 19:31:32.698616 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln9hb\" (UniqueName: \"kubernetes.io/projected/a2a23e4e-f43b-4990-a27c-c93d7172df27-kube-api-access-ln9hb\") on node \"crc\" DevicePath \"\"" Oct 03 19:31:33 crc kubenswrapper[4835]: I1003 19:31:33.029836 4835 generic.go:334] "Generic (PLEG): container finished" podID="a2a23e4e-f43b-4990-a27c-c93d7172df27" containerID="3ffa0a65790a5d03f37866baff2df57b57939f91456bbf0ad5c764f0a7c5622d" exitCode=0 Oct 03 19:31:33 crc kubenswrapper[4835]: I1003 19:31:33.029902 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29pnt" event={"ID":"a2a23e4e-f43b-4990-a27c-c93d7172df27","Type":"ContainerDied","Data":"3ffa0a65790a5d03f37866baff2df57b57939f91456bbf0ad5c764f0a7c5622d"} Oct 03 19:31:33 crc kubenswrapper[4835]: I1003 19:31:33.029937 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29pnt" event={"ID":"a2a23e4e-f43b-4990-a27c-c93d7172df27","Type":"ContainerDied","Data":"4aafd1d7550566f02d240c567afaa8c61b5305eacf4d6fde5799b75f3272c693"} Oct 03 19:31:33 crc kubenswrapper[4835]: I1003 19:31:33.029966 4835 scope.go:117] "RemoveContainer" containerID="3ffa0a65790a5d03f37866baff2df57b57939f91456bbf0ad5c764f0a7c5622d" Oct 03 19:31:33 crc kubenswrapper[4835]: I1003 19:31:33.030175 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-29pnt" Oct 03 19:31:33 crc kubenswrapper[4835]: I1003 19:31:33.066014 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-29pnt"] Oct 03 19:31:33 crc kubenswrapper[4835]: I1003 19:31:33.070561 4835 scope.go:117] "RemoveContainer" containerID="b5c1ca1dc073b741c8c9f1bc74a49db1a0e7aa1a50be36cb2de084e4a5f8a73c" Oct 03 19:31:33 crc kubenswrapper[4835]: I1003 19:31:33.090418 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-29pnt"] Oct 03 19:31:33 crc kubenswrapper[4835]: I1003 19:31:33.100310 4835 scope.go:117] "RemoveContainer" containerID="3513aeb8d0a94d9f5761b92e1f6d256b8bb0d73ce0a614313e10332de9e9eac3" Oct 03 19:31:33 crc kubenswrapper[4835]: I1003 19:31:33.172364 4835 scope.go:117] "RemoveContainer" containerID="3ffa0a65790a5d03f37866baff2df57b57939f91456bbf0ad5c764f0a7c5622d" Oct 03 19:31:33 crc kubenswrapper[4835]: E1003 19:31:33.173122 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ffa0a65790a5d03f37866baff2df57b57939f91456bbf0ad5c764f0a7c5622d\": container with ID starting with 3ffa0a65790a5d03f37866baff2df57b57939f91456bbf0ad5c764f0a7c5622d not found: ID does not exist" containerID="3ffa0a65790a5d03f37866baff2df57b57939f91456bbf0ad5c764f0a7c5622d" Oct 03 19:31:33 crc kubenswrapper[4835]: I1003 19:31:33.173179 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ffa0a65790a5d03f37866baff2df57b57939f91456bbf0ad5c764f0a7c5622d"} err="failed to get container status \"3ffa0a65790a5d03f37866baff2df57b57939f91456bbf0ad5c764f0a7c5622d\": rpc error: code = NotFound desc = could not find container \"3ffa0a65790a5d03f37866baff2df57b57939f91456bbf0ad5c764f0a7c5622d\": container with ID starting with 3ffa0a65790a5d03f37866baff2df57b57939f91456bbf0ad5c764f0a7c5622d not found: ID does not exist" Oct 03 19:31:33 crc kubenswrapper[4835]: I1003 19:31:33.173249 4835 scope.go:117] "RemoveContainer" containerID="b5c1ca1dc073b741c8c9f1bc74a49db1a0e7aa1a50be36cb2de084e4a5f8a73c" Oct 03 19:31:33 crc kubenswrapper[4835]: E1003 19:31:33.174014 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5c1ca1dc073b741c8c9f1bc74a49db1a0e7aa1a50be36cb2de084e4a5f8a73c\": container with ID starting with b5c1ca1dc073b741c8c9f1bc74a49db1a0e7aa1a50be36cb2de084e4a5f8a73c not found: ID does not exist" containerID="b5c1ca1dc073b741c8c9f1bc74a49db1a0e7aa1a50be36cb2de084e4a5f8a73c" Oct 03 19:31:33 crc kubenswrapper[4835]: I1003 19:31:33.174259 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c1ca1dc073b741c8c9f1bc74a49db1a0e7aa1a50be36cb2de084e4a5f8a73c"} err="failed to get container status \"b5c1ca1dc073b741c8c9f1bc74a49db1a0e7aa1a50be36cb2de084e4a5f8a73c\": rpc error: code = NotFound desc = could not find container \"b5c1ca1dc073b741c8c9f1bc74a49db1a0e7aa1a50be36cb2de084e4a5f8a73c\": container with ID starting with b5c1ca1dc073b741c8c9f1bc74a49db1a0e7aa1a50be36cb2de084e4a5f8a73c not found: ID does not exist" Oct 03 19:31:33 crc kubenswrapper[4835]: I1003 19:31:33.174356 4835 scope.go:117] "RemoveContainer" containerID="3513aeb8d0a94d9f5761b92e1f6d256b8bb0d73ce0a614313e10332de9e9eac3" Oct 03 19:31:33 crc kubenswrapper[4835]: E1003 19:31:33.174817 4835 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3513aeb8d0a94d9f5761b92e1f6d256b8bb0d73ce0a614313e10332de9e9eac3\": container with ID starting with 3513aeb8d0a94d9f5761b92e1f6d256b8bb0d73ce0a614313e10332de9e9eac3 not found: ID does not exist" containerID="3513aeb8d0a94d9f5761b92e1f6d256b8bb0d73ce0a614313e10332de9e9eac3" Oct 03 19:31:33 crc kubenswrapper[4835]: I1003 19:31:33.174845 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3513aeb8d0a94d9f5761b92e1f6d256b8bb0d73ce0a614313e10332de9e9eac3"} err="failed to get container status \"3513aeb8d0a94d9f5761b92e1f6d256b8bb0d73ce0a614313e10332de9e9eac3\": rpc error: code = NotFound desc = could not find container \"3513aeb8d0a94d9f5761b92e1f6d256b8bb0d73ce0a614313e10332de9e9eac3\": container with ID starting with 3513aeb8d0a94d9f5761b92e1f6d256b8bb0d73ce0a614313e10332de9e9eac3 not found: ID does not exist" Oct 03 19:31:34 crc kubenswrapper[4835]: I1003 19:31:34.877666 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:31:34 crc kubenswrapper[4835]: E1003 19:31:34.878119 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:31:34 crc kubenswrapper[4835]: I1003 19:31:34.891781 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2a23e4e-f43b-4990-a27c-c93d7172df27" path="/var/lib/kubelet/pods/a2a23e4e-f43b-4990-a27c-c93d7172df27/volumes" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.103425 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2g5kk"] Oct 03 19:31:42 crc kubenswrapper[4835]: E1003 19:31:42.104957 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb" containerName="extract-utilities" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.104975 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb" containerName="extract-utilities" Oct 03 19:31:42 crc kubenswrapper[4835]: E1003 19:31:42.105032 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2a23e4e-f43b-4990-a27c-c93d7172df27" containerName="extract-utilities" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.105039 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2a23e4e-f43b-4990-a27c-c93d7172df27" containerName="extract-utilities" Oct 03 19:31:42 crc kubenswrapper[4835]: E1003 19:31:42.105062 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb" containerName="extract-content" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.105085 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb" containerName="extract-content" Oct 03 19:31:42 crc kubenswrapper[4835]: E1003 19:31:42.105098 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb" containerName="registry-server" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.105104 4835 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb" containerName="registry-server" Oct 03 19:31:42 crc kubenswrapper[4835]: E1003 19:31:42.105112 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2a23e4e-f43b-4990-a27c-c93d7172df27" containerName="extract-content" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.105120 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2a23e4e-f43b-4990-a27c-c93d7172df27" containerName="extract-content" Oct 03 19:31:42 crc kubenswrapper[4835]: E1003 19:31:42.105129 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2a23e4e-f43b-4990-a27c-c93d7172df27" containerName="registry-server" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.105136 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2a23e4e-f43b-4990-a27c-c93d7172df27" containerName="registry-server" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.105340 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2a23e4e-f43b-4990-a27c-c93d7172df27" containerName="registry-server" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.105359 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da0e0d5-2a5e-49b8-a9cd-0d0f7a17a5fb" containerName="registry-server" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.107033 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.114397 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2g5kk"] Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.232634 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l27c\" (UniqueName: \"kubernetes.io/projected/1e9f595a-d3de-4f71-a375-187f905f834a-kube-api-access-8l27c\") pod \"redhat-operators-2g5kk\" (UID: \"1e9f595a-d3de-4f71-a375-187f905f834a\") " pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.232830 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e9f595a-d3de-4f71-a375-187f905f834a-utilities\") pod \"redhat-operators-2g5kk\" (UID: \"1e9f595a-d3de-4f71-a375-187f905f834a\") " pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.232865 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e9f595a-d3de-4f71-a375-187f905f834a-catalog-content\") pod \"redhat-operators-2g5kk\" (UID: \"1e9f595a-d3de-4f71-a375-187f905f834a\") " pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.334973 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l27c\" (UniqueName: \"kubernetes.io/projected/1e9f595a-d3de-4f71-a375-187f905f834a-kube-api-access-8l27c\") pod \"redhat-operators-2g5kk\" (UID: \"1e9f595a-d3de-4f71-a375-187f905f834a\") " pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.335124 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e9f595a-d3de-4f71-a375-187f905f834a-utilities\") pod \"redhat-operators-2g5kk\" (UID: 
\"1e9f595a-d3de-4f71-a375-187f905f834a\") " pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.335155 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e9f595a-d3de-4f71-a375-187f905f834a-catalog-content\") pod \"redhat-operators-2g5kk\" (UID: \"1e9f595a-d3de-4f71-a375-187f905f834a\") " pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.335715 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e9f595a-d3de-4f71-a375-187f905f834a-catalog-content\") pod \"redhat-operators-2g5kk\" (UID: \"1e9f595a-d3de-4f71-a375-187f905f834a\") " pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.335751 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e9f595a-d3de-4f71-a375-187f905f834a-utilities\") pod \"redhat-operators-2g5kk\" (UID: \"1e9f595a-d3de-4f71-a375-187f905f834a\") " pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.384047 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l27c\" (UniqueName: \"kubernetes.io/projected/1e9f595a-d3de-4f71-a375-187f905f834a-kube-api-access-8l27c\") pod \"redhat-operators-2g5kk\" (UID: \"1e9f595a-d3de-4f71-a375-187f905f834a\") " pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:42 crc kubenswrapper[4835]: I1003 19:31:42.436977 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:43 crc kubenswrapper[4835]: I1003 19:31:43.046799 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2g5kk"] Oct 03 19:31:43 crc kubenswrapper[4835]: I1003 19:31:43.194111 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g5kk" event={"ID":"1e9f595a-d3de-4f71-a375-187f905f834a","Type":"ContainerStarted","Data":"aeee95f6b8ffa4d542685def145981e3acbab69db04b684236bfadb21640dfec"} Oct 03 19:31:44 crc kubenswrapper[4835]: I1003 19:31:44.209237 4835 generic.go:334] "Generic (PLEG): container finished" podID="1e9f595a-d3de-4f71-a375-187f905f834a" containerID="8b3d90d845d9ab8d6b2f47017505434a48fbc8326487b7c095bca2d931dcfabf" exitCode=0 Oct 03 19:31:44 crc kubenswrapper[4835]: I1003 19:31:44.209333 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g5kk" event={"ID":"1e9f595a-d3de-4f71-a375-187f905f834a","Type":"ContainerDied","Data":"8b3d90d845d9ab8d6b2f47017505434a48fbc8326487b7c095bca2d931dcfabf"} Oct 03 19:31:46 crc kubenswrapper[4835]: I1003 19:31:46.250239 4835 generic.go:334] "Generic (PLEG): container finished" podID="1e9f595a-d3de-4f71-a375-187f905f834a" containerID="8ef3543f21b0b74f87fc373e47a8cd0f1f6e41a630f8d29d1e7fd2a909bfe788" exitCode=0 Oct 03 19:31:46 crc kubenswrapper[4835]: I1003 19:31:46.250549 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g5kk" event={"ID":"1e9f595a-d3de-4f71-a375-187f905f834a","Type":"ContainerDied","Data":"8ef3543f21b0b74f87fc373e47a8cd0f1f6e41a630f8d29d1e7fd2a909bfe788"} Oct 03 19:31:47 crc kubenswrapper[4835]: I1003 19:31:47.267011 4835 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g5kk" event={"ID":"1e9f595a-d3de-4f71-a375-187f905f834a","Type":"ContainerStarted","Data":"c33acb0100a624f65e9200e8ea97b82edeea2571024877e67554fa7a6d61dd01"} Oct 03 19:31:47 crc kubenswrapper[4835]: I1003 19:31:47.297373 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2g5kk" podStartSLOduration=2.658220365 podStartE2EDuration="5.297350942s" podCreationTimestamp="2025-10-03 19:31:42 +0000 UTC" firstStartedPulling="2025-10-03 19:31:44.212520172 +0000 UTC m=+4645.928461044" lastFinishedPulling="2025-10-03 19:31:46.851650749 +0000 UTC m=+4648.567591621" observedRunningTime="2025-10-03 19:31:47.286746622 +0000 UTC m=+4649.002687494" watchObservedRunningTime="2025-10-03 19:31:47.297350942 +0000 UTC m=+4649.013291814" Oct 03 19:31:48 crc kubenswrapper[4835]: I1003 19:31:48.884911 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:31:49 crc kubenswrapper[4835]: I1003 19:31:49.328984 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"1a8319c7829fb9a8b7febda9558f8244427ce5144e79867d556f18cceb6475d4"} Oct 03 19:31:52 crc kubenswrapper[4835]: I1003 19:31:52.437347 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:52 crc kubenswrapper[4835]: I1003 19:31:52.437749 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:52 crc kubenswrapper[4835]: I1003 19:31:52.492505 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:53 crc kubenswrapper[4835]: I1003 19:31:53.456935 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:53 crc kubenswrapper[4835]: I1003 19:31:53.519336 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2g5kk"] Oct 03 19:31:55 crc kubenswrapper[4835]: I1003 19:31:55.158675 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9j6m7"] Oct 03 19:31:55 crc kubenswrapper[4835]: I1003 19:31:55.162056 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:31:55 crc kubenswrapper[4835]: I1003 19:31:55.190365 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9j6m7"] Oct 03 19:31:55 crc kubenswrapper[4835]: I1003 19:31:55.287429 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/130d7ad5-049e-4bd8-9029-23ccbce73b7e-utilities\") pod \"community-operators-9j6m7\" (UID: \"130d7ad5-049e-4bd8-9029-23ccbce73b7e\") " pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:31:55 crc kubenswrapper[4835]: I1003 19:31:55.287520 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/130d7ad5-049e-4bd8-9029-23ccbce73b7e-catalog-content\") pod \"community-operators-9j6m7\" (UID: \"130d7ad5-049e-4bd8-9029-23ccbce73b7e\") " pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:31:55 crc kubenswrapper[4835]: I1003 19:31:55.287555 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56fkp\" (UniqueName: \"kubernetes.io/projected/130d7ad5-049e-4bd8-9029-23ccbce73b7e-kube-api-access-56fkp\") pod \"community-operators-9j6m7\" (UID: \"130d7ad5-049e-4bd8-9029-23ccbce73b7e\") " pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:31:55 crc kubenswrapper[4835]: I1003 19:31:55.390574 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/130d7ad5-049e-4bd8-9029-23ccbce73b7e-utilities\") pod \"community-operators-9j6m7\" (UID: \"130d7ad5-049e-4bd8-9029-23ccbce73b7e\") " pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:31:55 crc kubenswrapper[4835]: I1003 19:31:55.390634 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/130d7ad5-049e-4bd8-9029-23ccbce73b7e-catalog-content\") pod \"community-operators-9j6m7\" (UID: \"130d7ad5-049e-4bd8-9029-23ccbce73b7e\") " pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:31:55 crc kubenswrapper[4835]: I1003 19:31:55.390676 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56fkp\" (UniqueName: \"kubernetes.io/projected/130d7ad5-049e-4bd8-9029-23ccbce73b7e-kube-api-access-56fkp\") pod \"community-operators-9j6m7\" (UID: \"130d7ad5-049e-4bd8-9029-23ccbce73b7e\") " pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:31:55 crc kubenswrapper[4835]: I1003 19:31:55.391387 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/130d7ad5-049e-4bd8-9029-23ccbce73b7e-utilities\") pod \"community-operators-9j6m7\" (UID: \"130d7ad5-049e-4bd8-9029-23ccbce73b7e\") " pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:31:55 crc kubenswrapper[4835]: I1003 19:31:55.391831 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/130d7ad5-049e-4bd8-9029-23ccbce73b7e-catalog-content\") pod \"community-operators-9j6m7\" (UID: \"130d7ad5-049e-4bd8-9029-23ccbce73b7e\") " pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:31:55 crc kubenswrapper[4835]: I1003 19:31:55.412044 4835 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2g5kk" podUID="1e9f595a-d3de-4f71-a375-187f905f834a" containerName="registry-server" containerID="cri-o://c33acb0100a624f65e9200e8ea97b82edeea2571024877e67554fa7a6d61dd01" gracePeriod=2 Oct 03 19:31:55 crc kubenswrapper[4835]: I1003 19:31:55.416660 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56fkp\" (UniqueName: \"kubernetes.io/projected/130d7ad5-049e-4bd8-9029-23ccbce73b7e-kube-api-access-56fkp\") pod \"community-operators-9j6m7\" (UID: \"130d7ad5-049e-4bd8-9029-23ccbce73b7e\") " pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:31:55 crc kubenswrapper[4835]: I1003 19:31:55.501358 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.233446 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.252493 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9j6m7"] Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.329296 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l27c\" (UniqueName: \"kubernetes.io/projected/1e9f595a-d3de-4f71-a375-187f905f834a-kube-api-access-8l27c\") pod \"1e9f595a-d3de-4f71-a375-187f905f834a\" (UID: \"1e9f595a-d3de-4f71-a375-187f905f834a\") " Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.329573 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e9f595a-d3de-4f71-a375-187f905f834a-catalog-content\") pod \"1e9f595a-d3de-4f71-a375-187f905f834a\" (UID: \"1e9f595a-d3de-4f71-a375-187f905f834a\") " Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.329689 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e9f595a-d3de-4f71-a375-187f905f834a-utilities\") pod \"1e9f595a-d3de-4f71-a375-187f905f834a\" (UID: \"1e9f595a-d3de-4f71-a375-187f905f834a\") " Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.330636 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e9f595a-d3de-4f71-a375-187f905f834a-utilities" (OuterVolumeSpecName: "utilities") pod "1e9f595a-d3de-4f71-a375-187f905f834a" (UID: "1e9f595a-d3de-4f71-a375-187f905f834a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.340204 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e9f595a-d3de-4f71-a375-187f905f834a-kube-api-access-8l27c" (OuterVolumeSpecName: "kube-api-access-8l27c") pod "1e9f595a-d3de-4f71-a375-187f905f834a" (UID: "1e9f595a-d3de-4f71-a375-187f905f834a"). InnerVolumeSpecName "kube-api-access-8l27c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.410660 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e9f595a-d3de-4f71-a375-187f905f834a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e9f595a-d3de-4f71-a375-187f905f834a" (UID: "1e9f595a-d3de-4f71-a375-187f905f834a"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.434287 4835 generic.go:334] "Generic (PLEG): container finished" podID="1e9f595a-d3de-4f71-a375-187f905f834a" containerID="c33acb0100a624f65e9200e8ea97b82edeea2571024877e67554fa7a6d61dd01" exitCode=0 Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.434563 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g5kk" event={"ID":"1e9f595a-d3de-4f71-a375-187f905f834a","Type":"ContainerDied","Data":"c33acb0100a624f65e9200e8ea97b82edeea2571024877e67554fa7a6d61dd01"} Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.434989 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g5kk" event={"ID":"1e9f595a-d3de-4f71-a375-187f905f834a","Type":"ContainerDied","Data":"aeee95f6b8ffa4d542685def145981e3acbab69db04b684236bfadb21640dfec"} Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.435021 4835 scope.go:117] "RemoveContainer" containerID="c33acb0100a624f65e9200e8ea97b82edeea2571024877e67554fa7a6d61dd01" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.434589 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2g5kk" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.435989 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l27c\" (UniqueName: \"kubernetes.io/projected/1e9f595a-d3de-4f71-a375-187f905f834a-kube-api-access-8l27c\") on node \"crc\" DevicePath \"\"" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.436026 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e9f595a-d3de-4f71-a375-187f905f834a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.436039 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e9f595a-d3de-4f71-a375-187f905f834a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.440811 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9j6m7" event={"ID":"130d7ad5-049e-4bd8-9029-23ccbce73b7e","Type":"ContainerStarted","Data":"0bb54a5a75969781acf1e3cff8a2c3da4f4c50fed3211289f90489b38c5a9f11"} Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.501317 4835 scope.go:117] "RemoveContainer" containerID="8ef3543f21b0b74f87fc373e47a8cd0f1f6e41a630f8d29d1e7fd2a909bfe788" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.574798 4835 scope.go:117] "RemoveContainer" containerID="8b3d90d845d9ab8d6b2f47017505434a48fbc8326487b7c095bca2d931dcfabf" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.580458 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2g5kk"] Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.594730 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2g5kk"] Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.614424 4835 scope.go:117] "RemoveContainer" containerID="c33acb0100a624f65e9200e8ea97b82edeea2571024877e67554fa7a6d61dd01" Oct 03 19:31:56 crc kubenswrapper[4835]: E1003 19:31:56.615137 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"c33acb0100a624f65e9200e8ea97b82edeea2571024877e67554fa7a6d61dd01\": container with ID starting with c33acb0100a624f65e9200e8ea97b82edeea2571024877e67554fa7a6d61dd01 not found: ID does not exist" containerID="c33acb0100a624f65e9200e8ea97b82edeea2571024877e67554fa7a6d61dd01" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.615199 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c33acb0100a624f65e9200e8ea97b82edeea2571024877e67554fa7a6d61dd01"} err="failed to get container status \"c33acb0100a624f65e9200e8ea97b82edeea2571024877e67554fa7a6d61dd01\": rpc error: code = NotFound desc = could not find container \"c33acb0100a624f65e9200e8ea97b82edeea2571024877e67554fa7a6d61dd01\": container with ID starting with c33acb0100a624f65e9200e8ea97b82edeea2571024877e67554fa7a6d61dd01 not found: ID does not exist" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.615234 4835 scope.go:117] "RemoveContainer" containerID="8ef3543f21b0b74f87fc373e47a8cd0f1f6e41a630f8d29d1e7fd2a909bfe788" Oct 03 19:31:56 crc kubenswrapper[4835]: E1003 19:31:56.615828 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef3543f21b0b74f87fc373e47a8cd0f1f6e41a630f8d29d1e7fd2a909bfe788\": container with ID starting with 8ef3543f21b0b74f87fc373e47a8cd0f1f6e41a630f8d29d1e7fd2a909bfe788 not found: ID does not exist" containerID="8ef3543f21b0b74f87fc373e47a8cd0f1f6e41a630f8d29d1e7fd2a909bfe788" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.615916 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef3543f21b0b74f87fc373e47a8cd0f1f6e41a630f8d29d1e7fd2a909bfe788"} err="failed to get container status \"8ef3543f21b0b74f87fc373e47a8cd0f1f6e41a630f8d29d1e7fd2a909bfe788\": rpc error: code = NotFound desc = could not find container \"8ef3543f21b0b74f87fc373e47a8cd0f1f6e41a630f8d29d1e7fd2a909bfe788\": container with ID starting with 8ef3543f21b0b74f87fc373e47a8cd0f1f6e41a630f8d29d1e7fd2a909bfe788 not found: ID does not exist" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.615964 4835 scope.go:117] "RemoveContainer" containerID="8b3d90d845d9ab8d6b2f47017505434a48fbc8326487b7c095bca2d931dcfabf" Oct 03 19:31:56 crc kubenswrapper[4835]: E1003 19:31:56.616542 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b3d90d845d9ab8d6b2f47017505434a48fbc8326487b7c095bca2d931dcfabf\": container with ID starting with 8b3d90d845d9ab8d6b2f47017505434a48fbc8326487b7c095bca2d931dcfabf not found: ID does not exist" containerID="8b3d90d845d9ab8d6b2f47017505434a48fbc8326487b7c095bca2d931dcfabf" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.616595 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b3d90d845d9ab8d6b2f47017505434a48fbc8326487b7c095bca2d931dcfabf"} err="failed to get container status \"8b3d90d845d9ab8d6b2f47017505434a48fbc8326487b7c095bca2d931dcfabf\": rpc error: code = NotFound desc = could not find container \"8b3d90d845d9ab8d6b2f47017505434a48fbc8326487b7c095bca2d931dcfabf\": container with ID starting with 8b3d90d845d9ab8d6b2f47017505434a48fbc8326487b7c095bca2d931dcfabf not found: ID does not exist" Oct 03 19:31:56 crc kubenswrapper[4835]: I1003 19:31:56.892720 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e9f595a-d3de-4f71-a375-187f905f834a" 
path="/var/lib/kubelet/pods/1e9f595a-d3de-4f71-a375-187f905f834a/volumes" Oct 03 19:31:57 crc kubenswrapper[4835]: I1003 19:31:57.451619 4835 generic.go:334] "Generic (PLEG): container finished" podID="130d7ad5-049e-4bd8-9029-23ccbce73b7e" containerID="6eb1b633010d7220fc1d078f9982b74d405629a929e312f353c0f6e66992d557" exitCode=0 Oct 03 19:31:57 crc kubenswrapper[4835]: I1003 19:31:57.451714 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9j6m7" event={"ID":"130d7ad5-049e-4bd8-9029-23ccbce73b7e","Type":"ContainerDied","Data":"6eb1b633010d7220fc1d078f9982b74d405629a929e312f353c0f6e66992d557"} Oct 03 19:31:58 crc kubenswrapper[4835]: I1003 19:31:58.466958 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9j6m7" event={"ID":"130d7ad5-049e-4bd8-9029-23ccbce73b7e","Type":"ContainerStarted","Data":"c2a135f1e1161e1c587c118d61243a5dfe8fed3150dc4540079475b6243bf376"} Oct 03 19:31:59 crc kubenswrapper[4835]: I1003 19:31:59.481918 4835 generic.go:334] "Generic (PLEG): container finished" podID="130d7ad5-049e-4bd8-9029-23ccbce73b7e" containerID="c2a135f1e1161e1c587c118d61243a5dfe8fed3150dc4540079475b6243bf376" exitCode=0 Oct 03 19:31:59 crc kubenswrapper[4835]: I1003 19:31:59.481982 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9j6m7" event={"ID":"130d7ad5-049e-4bd8-9029-23ccbce73b7e","Type":"ContainerDied","Data":"c2a135f1e1161e1c587c118d61243a5dfe8fed3150dc4540079475b6243bf376"} Oct 03 19:32:00 crc kubenswrapper[4835]: I1003 19:32:00.510269 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9j6m7" event={"ID":"130d7ad5-049e-4bd8-9029-23ccbce73b7e","Type":"ContainerStarted","Data":"13f2ec8ee0fcc8fb60a561b970aa52a38b6146457996bae87949b5281414a8c8"} Oct 03 19:32:00 crc kubenswrapper[4835]: I1003 19:32:00.530854 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9j6m7" podStartSLOduration=3.049955261 podStartE2EDuration="5.530831369s" podCreationTimestamp="2025-10-03 19:31:55 +0000 UTC" firstStartedPulling="2025-10-03 19:31:57.455047501 +0000 UTC m=+4659.170988383" lastFinishedPulling="2025-10-03 19:31:59.935923619 +0000 UTC m=+4661.651864491" observedRunningTime="2025-10-03 19:32:00.527329513 +0000 UTC m=+4662.243270385" watchObservedRunningTime="2025-10-03 19:32:00.530831369 +0000 UTC m=+4662.246772241" Oct 03 19:32:05 crc kubenswrapper[4835]: I1003 19:32:05.501521 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:32:05 crc kubenswrapper[4835]: I1003 19:32:05.502503 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:32:05 crc kubenswrapper[4835]: I1003 19:32:05.566314 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:32:05 crc kubenswrapper[4835]: I1003 19:32:05.642643 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:32:05 crc kubenswrapper[4835]: I1003 19:32:05.813270 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9j6m7"] Oct 03 19:32:07 crc kubenswrapper[4835]: I1003 19:32:07.599406 4835 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/community-operators-9j6m7" podUID="130d7ad5-049e-4bd8-9029-23ccbce73b7e" containerName="registry-server" containerID="cri-o://13f2ec8ee0fcc8fb60a561b970aa52a38b6146457996bae87949b5281414a8c8" gracePeriod=2 Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.172276 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.235369 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/130d7ad5-049e-4bd8-9029-23ccbce73b7e-catalog-content\") pod \"130d7ad5-049e-4bd8-9029-23ccbce73b7e\" (UID: \"130d7ad5-049e-4bd8-9029-23ccbce73b7e\") " Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.235550 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/130d7ad5-049e-4bd8-9029-23ccbce73b7e-utilities\") pod \"130d7ad5-049e-4bd8-9029-23ccbce73b7e\" (UID: \"130d7ad5-049e-4bd8-9029-23ccbce73b7e\") " Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.235698 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56fkp\" (UniqueName: \"kubernetes.io/projected/130d7ad5-049e-4bd8-9029-23ccbce73b7e-kube-api-access-56fkp\") pod \"130d7ad5-049e-4bd8-9029-23ccbce73b7e\" (UID: \"130d7ad5-049e-4bd8-9029-23ccbce73b7e\") " Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.237601 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/130d7ad5-049e-4bd8-9029-23ccbce73b7e-utilities" (OuterVolumeSpecName: "utilities") pod "130d7ad5-049e-4bd8-9029-23ccbce73b7e" (UID: "130d7ad5-049e-4bd8-9029-23ccbce73b7e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.243049 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130d7ad5-049e-4bd8-9029-23ccbce73b7e-kube-api-access-56fkp" (OuterVolumeSpecName: "kube-api-access-56fkp") pod "130d7ad5-049e-4bd8-9029-23ccbce73b7e" (UID: "130d7ad5-049e-4bd8-9029-23ccbce73b7e"). InnerVolumeSpecName "kube-api-access-56fkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.296508 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/130d7ad5-049e-4bd8-9029-23ccbce73b7e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "130d7ad5-049e-4bd8-9029-23ccbce73b7e" (UID: "130d7ad5-049e-4bd8-9029-23ccbce73b7e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.338967 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/130d7ad5-049e-4bd8-9029-23ccbce73b7e-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.339017 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56fkp\" (UniqueName: \"kubernetes.io/projected/130d7ad5-049e-4bd8-9029-23ccbce73b7e-kube-api-access-56fkp\") on node \"crc\" DevicePath \"\"" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.339032 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/130d7ad5-049e-4bd8-9029-23ccbce73b7e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.615192 4835 generic.go:334] "Generic (PLEG): container finished" podID="130d7ad5-049e-4bd8-9029-23ccbce73b7e" containerID="13f2ec8ee0fcc8fb60a561b970aa52a38b6146457996bae87949b5281414a8c8" exitCode=0 Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.615249 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9j6m7" event={"ID":"130d7ad5-049e-4bd8-9029-23ccbce73b7e","Type":"ContainerDied","Data":"13f2ec8ee0fcc8fb60a561b970aa52a38b6146457996bae87949b5281414a8c8"} Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.615281 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9j6m7" event={"ID":"130d7ad5-049e-4bd8-9029-23ccbce73b7e","Type":"ContainerDied","Data":"0bb54a5a75969781acf1e3cff8a2c3da4f4c50fed3211289f90489b38c5a9f11"} Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.615307 4835 scope.go:117] "RemoveContainer" containerID="13f2ec8ee0fcc8fb60a561b970aa52a38b6146457996bae87949b5281414a8c8" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.615338 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9j6m7" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.671100 4835 scope.go:117] "RemoveContainer" containerID="c2a135f1e1161e1c587c118d61243a5dfe8fed3150dc4540079475b6243bf376" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.675151 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9j6m7"] Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.683475 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9j6m7"] Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.716787 4835 scope.go:117] "RemoveContainer" containerID="6eb1b633010d7220fc1d078f9982b74d405629a929e312f353c0f6e66992d557" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.782099 4835 scope.go:117] "RemoveContainer" containerID="13f2ec8ee0fcc8fb60a561b970aa52a38b6146457996bae87949b5281414a8c8" Oct 03 19:32:08 crc kubenswrapper[4835]: E1003 19:32:08.782818 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13f2ec8ee0fcc8fb60a561b970aa52a38b6146457996bae87949b5281414a8c8\": container with ID starting with 13f2ec8ee0fcc8fb60a561b970aa52a38b6146457996bae87949b5281414a8c8 not found: ID does not exist" containerID="13f2ec8ee0fcc8fb60a561b970aa52a38b6146457996bae87949b5281414a8c8" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.782894 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13f2ec8ee0fcc8fb60a561b970aa52a38b6146457996bae87949b5281414a8c8"} err="failed to get container status \"13f2ec8ee0fcc8fb60a561b970aa52a38b6146457996bae87949b5281414a8c8\": rpc error: code = NotFound desc = could not find container \"13f2ec8ee0fcc8fb60a561b970aa52a38b6146457996bae87949b5281414a8c8\": container with ID starting with 13f2ec8ee0fcc8fb60a561b970aa52a38b6146457996bae87949b5281414a8c8 not found: ID does not exist" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.782939 4835 scope.go:117] "RemoveContainer" containerID="c2a135f1e1161e1c587c118d61243a5dfe8fed3150dc4540079475b6243bf376" Oct 03 19:32:08 crc kubenswrapper[4835]: E1003 19:32:08.783339 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2a135f1e1161e1c587c118d61243a5dfe8fed3150dc4540079475b6243bf376\": container with ID starting with c2a135f1e1161e1c587c118d61243a5dfe8fed3150dc4540079475b6243bf376 not found: ID does not exist" containerID="c2a135f1e1161e1c587c118d61243a5dfe8fed3150dc4540079475b6243bf376" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.783390 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2a135f1e1161e1c587c118d61243a5dfe8fed3150dc4540079475b6243bf376"} err="failed to get container status \"c2a135f1e1161e1c587c118d61243a5dfe8fed3150dc4540079475b6243bf376\": rpc error: code = NotFound desc = could not find container \"c2a135f1e1161e1c587c118d61243a5dfe8fed3150dc4540079475b6243bf376\": container with ID starting with c2a135f1e1161e1c587c118d61243a5dfe8fed3150dc4540079475b6243bf376 not found: ID does not exist" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.783422 4835 scope.go:117] "RemoveContainer" containerID="6eb1b633010d7220fc1d078f9982b74d405629a929e312f353c0f6e66992d557" Oct 03 19:32:08 crc kubenswrapper[4835]: E1003 19:32:08.783691 4835 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6eb1b633010d7220fc1d078f9982b74d405629a929e312f353c0f6e66992d557\": container with ID starting with 6eb1b633010d7220fc1d078f9982b74d405629a929e312f353c0f6e66992d557 not found: ID does not exist" containerID="6eb1b633010d7220fc1d078f9982b74d405629a929e312f353c0f6e66992d557" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.783720 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eb1b633010d7220fc1d078f9982b74d405629a929e312f353c0f6e66992d557"} err="failed to get container status \"6eb1b633010d7220fc1d078f9982b74d405629a929e312f353c0f6e66992d557\": rpc error: code = NotFound desc = could not find container \"6eb1b633010d7220fc1d078f9982b74d405629a929e312f353c0f6e66992d557\": container with ID starting with 6eb1b633010d7220fc1d078f9982b74d405629a929e312f353c0f6e66992d557 not found: ID does not exist" Oct 03 19:32:08 crc kubenswrapper[4835]: I1003 19:32:08.896010 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="130d7ad5-049e-4bd8-9029-23ccbce73b7e" path="/var/lib/kubelet/pods/130d7ad5-049e-4bd8-9029-23ccbce73b7e/volumes" Oct 03 19:34:05 crc kubenswrapper[4835]: I1003 19:34:05.359162 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:34:05 crc kubenswrapper[4835]: I1003 19:34:05.360053 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:34:35 crc kubenswrapper[4835]: I1003 19:34:35.358481 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:34:35 crc kubenswrapper[4835]: I1003 19:34:35.359282 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:35:05 crc kubenswrapper[4835]: I1003 19:35:05.359048 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:35:05 crc kubenswrapper[4835]: I1003 19:35:05.361825 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:35:05 crc kubenswrapper[4835]: I1003 19:35:05.361954 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 19:35:05 crc kubenswrapper[4835]: I1003 19:35:05.363042 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a8319c7829fb9a8b7febda9558f8244427ce5144e79867d556f18cceb6475d4"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 19:35:05 crc kubenswrapper[4835]: I1003 19:35:05.363211 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://1a8319c7829fb9a8b7febda9558f8244427ce5144e79867d556f18cceb6475d4" gracePeriod=600 Oct 03 19:35:05 crc kubenswrapper[4835]: I1003 19:35:05.775879 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="1a8319c7829fb9a8b7febda9558f8244427ce5144e79867d556f18cceb6475d4" exitCode=0 Oct 03 19:35:05 crc kubenswrapper[4835]: I1003 19:35:05.776039 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"1a8319c7829fb9a8b7febda9558f8244427ce5144e79867d556f18cceb6475d4"} Oct 03 19:35:05 crc kubenswrapper[4835]: I1003 19:35:05.776899 4835 scope.go:117] "RemoveContainer" containerID="4e2ec17b27855e3ab5371b84ec506e70083f39ab37f770191eaf2b592e26f0dc" Oct 03 19:35:06 crc kubenswrapper[4835]: I1003 19:35:06.791198 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9"} Oct 03 19:37:35 crc kubenswrapper[4835]: I1003 19:37:35.359521 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:37:35 crc kubenswrapper[4835]: I1003 19:37:35.360421 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:38:05 crc kubenswrapper[4835]: I1003 19:38:05.358733 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:38:05 crc kubenswrapper[4835]: I1003 19:38:05.359684 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:38:35 crc 
kubenswrapper[4835]: I1003 19:38:35.359637 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:38:35 crc kubenswrapper[4835]: I1003 19:38:35.360860 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:38:35 crc kubenswrapper[4835]: I1003 19:38:35.361453 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 19:38:35 crc kubenswrapper[4835]: I1003 19:38:35.362781 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 19:38:35 crc kubenswrapper[4835]: I1003 19:38:35.362876 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" gracePeriod=600 Oct 03 19:38:35 crc kubenswrapper[4835]: E1003 19:38:35.540804 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:38:36 crc kubenswrapper[4835]: I1003 19:38:36.337449 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" exitCode=0 Oct 03 19:38:36 crc kubenswrapper[4835]: I1003 19:38:36.337514 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9"} Oct 03 19:38:36 crc kubenswrapper[4835]: I1003 19:38:36.337565 4835 scope.go:117] "RemoveContainer" containerID="1a8319c7829fb9a8b7febda9558f8244427ce5144e79867d556f18cceb6475d4" Oct 03 19:38:36 crc kubenswrapper[4835]: I1003 19:38:36.338732 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:38:36 crc kubenswrapper[4835]: E1003 19:38:36.339094 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:38:47 crc kubenswrapper[4835]: I1003 19:38:47.877787 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:38:47 crc kubenswrapper[4835]: E1003 19:38:47.878932 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:39:00 crc kubenswrapper[4835]: I1003 19:39:00.878795 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:39:00 crc kubenswrapper[4835]: E1003 19:39:00.880143 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:39:12 crc kubenswrapper[4835]: I1003 19:39:12.877802 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:39:12 crc kubenswrapper[4835]: E1003 19:39:12.878999 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:39:23 crc kubenswrapper[4835]: I1003 19:39:23.877744 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:39:23 crc kubenswrapper[4835]: E1003 19:39:23.878891 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:39:38 crc kubenswrapper[4835]: I1003 19:39:38.889678 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:39:38 crc kubenswrapper[4835]: E1003 19:39:38.891021 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" 
podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:39:51 crc kubenswrapper[4835]: I1003 19:39:51.878255 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:39:51 crc kubenswrapper[4835]: E1003 19:39:51.879409 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:40:02 crc kubenswrapper[4835]: I1003 19:40:02.877123 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:40:02 crc kubenswrapper[4835]: E1003 19:40:02.878094 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:40:17 crc kubenswrapper[4835]: I1003 19:40:17.877607 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:40:17 crc kubenswrapper[4835]: E1003 19:40:17.878714 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:40:28 crc kubenswrapper[4835]: I1003 19:40:28.877163 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:40:28 crc kubenswrapper[4835]: E1003 19:40:28.878167 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:40:40 crc kubenswrapper[4835]: I1003 19:40:40.877967 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:40:40 crc kubenswrapper[4835]: E1003 19:40:40.879052 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:40:54 crc kubenswrapper[4835]: I1003 19:40:54.878479 4835 scope.go:117] "RemoveContainer" 
containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:40:54 crc kubenswrapper[4835]: E1003 19:40:54.879532 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:41:09 crc kubenswrapper[4835]: I1003 19:41:09.877138 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:41:09 crc kubenswrapper[4835]: E1003 19:41:09.878386 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:41:22 crc kubenswrapper[4835]: I1003 19:41:22.877628 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:41:22 crc kubenswrapper[4835]: E1003 19:41:22.878862 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:41:34 crc kubenswrapper[4835]: I1003 19:41:34.877018 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:41:34 crc kubenswrapper[4835]: E1003 19:41:34.878132 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.532262 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gwvhb"] Oct 03 19:41:41 crc kubenswrapper[4835]: E1003 19:41:41.537518 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9f595a-d3de-4f71-a375-187f905f834a" containerName="extract-content" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.537802 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9f595a-d3de-4f71-a375-187f905f834a" containerName="extract-content" Oct 03 19:41:41 crc kubenswrapper[4835]: E1003 19:41:41.538048 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9f595a-d3de-4f71-a375-187f905f834a" containerName="registry-server" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.538141 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9f595a-d3de-4f71-a375-187f905f834a" containerName="registry-server" 
Oct 03 19:41:41 crc kubenswrapper[4835]: E1003 19:41:41.538226 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9f595a-d3de-4f71-a375-187f905f834a" containerName="extract-utilities" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.538397 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9f595a-d3de-4f71-a375-187f905f834a" containerName="extract-utilities" Oct 03 19:41:41 crc kubenswrapper[4835]: E1003 19:41:41.538503 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130d7ad5-049e-4bd8-9029-23ccbce73b7e" containerName="registry-server" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.538569 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="130d7ad5-049e-4bd8-9029-23ccbce73b7e" containerName="registry-server" Oct 03 19:41:41 crc kubenswrapper[4835]: E1003 19:41:41.538634 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130d7ad5-049e-4bd8-9029-23ccbce73b7e" containerName="extract-utilities" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.538749 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="130d7ad5-049e-4bd8-9029-23ccbce73b7e" containerName="extract-utilities" Oct 03 19:41:41 crc kubenswrapper[4835]: E1003 19:41:41.538838 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130d7ad5-049e-4bd8-9029-23ccbce73b7e" containerName="extract-content" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.539011 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="130d7ad5-049e-4bd8-9029-23ccbce73b7e" containerName="extract-content" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.539647 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="130d7ad5-049e-4bd8-9029-23ccbce73b7e" containerName="registry-server" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.539753 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e9f595a-d3de-4f71-a375-187f905f834a" containerName="registry-server" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.541687 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.556812 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwvhb"] Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.573816 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb3d2705-f4b5-4fac-92f3-cedaec209580-catalog-content\") pod \"certified-operators-gwvhb\" (UID: \"cb3d2705-f4b5-4fac-92f3-cedaec209580\") " pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.573881 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m58tx\" (UniqueName: \"kubernetes.io/projected/cb3d2705-f4b5-4fac-92f3-cedaec209580-kube-api-access-m58tx\") pod \"certified-operators-gwvhb\" (UID: \"cb3d2705-f4b5-4fac-92f3-cedaec209580\") " pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.574013 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb3d2705-f4b5-4fac-92f3-cedaec209580-utilities\") pod \"certified-operators-gwvhb\" (UID: \"cb3d2705-f4b5-4fac-92f3-cedaec209580\") " pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.678968 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb3d2705-f4b5-4fac-92f3-cedaec209580-utilities\") pod \"certified-operators-gwvhb\" (UID: \"cb3d2705-f4b5-4fac-92f3-cedaec209580\") " pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.680176 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb3d2705-f4b5-4fac-92f3-cedaec209580-utilities\") pod \"certified-operators-gwvhb\" (UID: \"cb3d2705-f4b5-4fac-92f3-cedaec209580\") " pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.681182 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb3d2705-f4b5-4fac-92f3-cedaec209580-catalog-content\") pod \"certified-operators-gwvhb\" (UID: \"cb3d2705-f4b5-4fac-92f3-cedaec209580\") " pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.681351 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m58tx\" (UniqueName: \"kubernetes.io/projected/cb3d2705-f4b5-4fac-92f3-cedaec209580-kube-api-access-m58tx\") pod \"certified-operators-gwvhb\" (UID: \"cb3d2705-f4b5-4fac-92f3-cedaec209580\") " pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.682060 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb3d2705-f4b5-4fac-92f3-cedaec209580-catalog-content\") pod \"certified-operators-gwvhb\" (UID: \"cb3d2705-f4b5-4fac-92f3-cedaec209580\") " pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.710102 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-m58tx\" (UniqueName: \"kubernetes.io/projected/cb3d2705-f4b5-4fac-92f3-cedaec209580-kube-api-access-m58tx\") pod \"certified-operators-gwvhb\" (UID: \"cb3d2705-f4b5-4fac-92f3-cedaec209580\") " pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:41:41 crc kubenswrapper[4835]: I1003 19:41:41.872802 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:41:42 crc kubenswrapper[4835]: I1003 19:41:42.509505 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwvhb"] Oct 03 19:41:42 crc kubenswrapper[4835]: W1003 19:41:42.511273 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb3d2705_f4b5_4fac_92f3_cedaec209580.slice/crio-71c9f7f91c12f9d6e2378bc9301d1e4a8d53d1a80006610ddc3c341055389108 WatchSource:0}: Error finding container 71c9f7f91c12f9d6e2378bc9301d1e4a8d53d1a80006610ddc3c341055389108: Status 404 returned error can't find the container with id 71c9f7f91c12f9d6e2378bc9301d1e4a8d53d1a80006610ddc3c341055389108 Oct 03 19:41:42 crc kubenswrapper[4835]: I1003 19:41:42.608911 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwvhb" event={"ID":"cb3d2705-f4b5-4fac-92f3-cedaec209580","Type":"ContainerStarted","Data":"71c9f7f91c12f9d6e2378bc9301d1e4a8d53d1a80006610ddc3c341055389108"} Oct 03 19:41:43 crc kubenswrapper[4835]: I1003 19:41:43.624304 4835 generic.go:334] "Generic (PLEG): container finished" podID="cb3d2705-f4b5-4fac-92f3-cedaec209580" containerID="0985f2a09c774b947329714f593ddd9d5d8cd8b8d0cce87ef2a0ac58e91c890c" exitCode=0 Oct 03 19:41:43 crc kubenswrapper[4835]: I1003 19:41:43.624495 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwvhb" event={"ID":"cb3d2705-f4b5-4fac-92f3-cedaec209580","Type":"ContainerDied","Data":"0985f2a09c774b947329714f593ddd9d5d8cd8b8d0cce87ef2a0ac58e91c890c"} Oct 03 19:41:43 crc kubenswrapper[4835]: I1003 19:41:43.626961 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 19:41:46 crc kubenswrapper[4835]: I1003 19:41:46.661052 4835 generic.go:334] "Generic (PLEG): container finished" podID="cb3d2705-f4b5-4fac-92f3-cedaec209580" containerID="b182d9f411b08f293f954d4a279c9766516a8a35d9238bcac8867328f6647596" exitCode=0 Oct 03 19:41:46 crc kubenswrapper[4835]: I1003 19:41:46.661199 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwvhb" event={"ID":"cb3d2705-f4b5-4fac-92f3-cedaec209580","Type":"ContainerDied","Data":"b182d9f411b08f293f954d4a279c9766516a8a35d9238bcac8867328f6647596"} Oct 03 19:41:48 crc kubenswrapper[4835]: I1003 19:41:48.690499 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwvhb" event={"ID":"cb3d2705-f4b5-4fac-92f3-cedaec209580","Type":"ContainerStarted","Data":"c40607fb3f6709e8646ede5fe62a56e672bd0d6a7dd7353dc5abcf81e2d83ae9"} Oct 03 19:41:48 crc kubenswrapper[4835]: I1003 19:41:48.715154 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gwvhb" podStartSLOduration=3.515772926 podStartE2EDuration="7.715117912s" podCreationTimestamp="2025-10-03 19:41:41 +0000 UTC" firstStartedPulling="2025-10-03 19:41:43.626747002 +0000 UTC 
m=+5245.342687874" lastFinishedPulling="2025-10-03 19:41:47.826091998 +0000 UTC m=+5249.542032860" observedRunningTime="2025-10-03 19:41:48.712671751 +0000 UTC m=+5250.428612643" watchObservedRunningTime="2025-10-03 19:41:48.715117912 +0000 UTC m=+5250.431058794" Oct 03 19:41:48 crc kubenswrapper[4835]: I1003 19:41:48.886318 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:41:48 crc kubenswrapper[4835]: E1003 19:41:48.886628 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:41:51 crc kubenswrapper[4835]: I1003 19:41:51.873878 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:41:51 crc kubenswrapper[4835]: I1003 19:41:51.874502 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:41:51 crc kubenswrapper[4835]: I1003 19:41:51.945495 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:41:58 crc kubenswrapper[4835]: I1003 19:41:58.374485 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-42g5f"] Oct 03 19:41:58 crc kubenswrapper[4835]: I1003 19:41:58.377952 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:41:58 crc kubenswrapper[4835]: I1003 19:41:58.389823 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-42g5f"] Oct 03 19:41:58 crc kubenswrapper[4835]: I1003 19:41:58.503115 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557cbe87-0345-47a7-8c73-eb52ddd7be64-utilities\") pod \"redhat-operators-42g5f\" (UID: \"557cbe87-0345-47a7-8c73-eb52ddd7be64\") " pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:41:58 crc kubenswrapper[4835]: I1003 19:41:58.503346 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhvgn\" (UniqueName: \"kubernetes.io/projected/557cbe87-0345-47a7-8c73-eb52ddd7be64-kube-api-access-mhvgn\") pod \"redhat-operators-42g5f\" (UID: \"557cbe87-0345-47a7-8c73-eb52ddd7be64\") " pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:41:58 crc kubenswrapper[4835]: I1003 19:41:58.503548 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557cbe87-0345-47a7-8c73-eb52ddd7be64-catalog-content\") pod \"redhat-operators-42g5f\" (UID: \"557cbe87-0345-47a7-8c73-eb52ddd7be64\") " pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:41:58 crc kubenswrapper[4835]: I1003 19:41:58.605362 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557cbe87-0345-47a7-8c73-eb52ddd7be64-utilities\") pod \"redhat-operators-42g5f\" (UID: \"557cbe87-0345-47a7-8c73-eb52ddd7be64\") " pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:41:58 crc kubenswrapper[4835]: I1003 19:41:58.605491 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhvgn\" (UniqueName: \"kubernetes.io/projected/557cbe87-0345-47a7-8c73-eb52ddd7be64-kube-api-access-mhvgn\") pod \"redhat-operators-42g5f\" (UID: \"557cbe87-0345-47a7-8c73-eb52ddd7be64\") " pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:41:58 crc kubenswrapper[4835]: I1003 19:41:58.605571 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557cbe87-0345-47a7-8c73-eb52ddd7be64-catalog-content\") pod \"redhat-operators-42g5f\" (UID: \"557cbe87-0345-47a7-8c73-eb52ddd7be64\") " pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:41:58 crc kubenswrapper[4835]: I1003 19:41:58.860034 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557cbe87-0345-47a7-8c73-eb52ddd7be64-utilities\") pod \"redhat-operators-42g5f\" (UID: \"557cbe87-0345-47a7-8c73-eb52ddd7be64\") " pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:41:58 crc kubenswrapper[4835]: I1003 19:41:58.860252 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557cbe87-0345-47a7-8c73-eb52ddd7be64-catalog-content\") pod \"redhat-operators-42g5f\" (UID: \"557cbe87-0345-47a7-8c73-eb52ddd7be64\") " pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:41:59 crc kubenswrapper[4835]: I1003 19:41:59.019678 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mhvgn\" (UniqueName: \"kubernetes.io/projected/557cbe87-0345-47a7-8c73-eb52ddd7be64-kube-api-access-mhvgn\") pod \"redhat-operators-42g5f\" (UID: \"557cbe87-0345-47a7-8c73-eb52ddd7be64\") " pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:41:59 crc kubenswrapper[4835]: I1003 19:41:59.172344 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:41:59 crc kubenswrapper[4835]: I1003 19:41:59.744200 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-42g5f"] Oct 03 19:41:59 crc kubenswrapper[4835]: I1003 19:41:59.826570 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42g5f" event={"ID":"557cbe87-0345-47a7-8c73-eb52ddd7be64","Type":"ContainerStarted","Data":"132fa09da634bcddecaea076698267a9f91e6268a02e4622d035931c653f7adb"} Oct 03 19:42:00 crc kubenswrapper[4835]: I1003 19:42:00.848048 4835 generic.go:334] "Generic (PLEG): container finished" podID="557cbe87-0345-47a7-8c73-eb52ddd7be64" containerID="a6a1d3f7e8e38ac23260220a24385859f41dcf32aabb663ae31e183095e92d33" exitCode=0 Oct 03 19:42:00 crc kubenswrapper[4835]: I1003 19:42:00.848247 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42g5f" event={"ID":"557cbe87-0345-47a7-8c73-eb52ddd7be64","Type":"ContainerDied","Data":"a6a1d3f7e8e38ac23260220a24385859f41dcf32aabb663ae31e183095e92d33"} Oct 03 19:42:01 crc kubenswrapper[4835]: I1003 19:42:01.877871 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:42:01 crc kubenswrapper[4835]: E1003 19:42:01.878722 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:42:01 crc kubenswrapper[4835]: I1003 19:42:01.940999 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:42:02 crc kubenswrapper[4835]: I1003 19:42:02.741344 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gwvhb"] Oct 03 19:42:02 crc kubenswrapper[4835]: I1003 19:42:02.874306 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42g5f" event={"ID":"557cbe87-0345-47a7-8c73-eb52ddd7be64","Type":"ContainerStarted","Data":"d5a298dc101cebded51b07989873ddf8187d64ba49828d94d658fc8638179dcc"} Oct 03 19:42:02 crc kubenswrapper[4835]: I1003 19:42:02.874572 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gwvhb" podUID="cb3d2705-f4b5-4fac-92f3-cedaec209580" containerName="registry-server" containerID="cri-o://c40607fb3f6709e8646ede5fe62a56e672bd0d6a7dd7353dc5abcf81e2d83ae9" gracePeriod=2 Oct 03 19:42:03 crc kubenswrapper[4835]: I1003 19:42:03.888000 4835 generic.go:334] "Generic (PLEG): container finished" podID="cb3d2705-f4b5-4fac-92f3-cedaec209580" containerID="c40607fb3f6709e8646ede5fe62a56e672bd0d6a7dd7353dc5abcf81e2d83ae9" exitCode=0 Oct 03 19:42:03 crc kubenswrapper[4835]: I1003 
19:42:03.888093 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwvhb" event={"ID":"cb3d2705-f4b5-4fac-92f3-cedaec209580","Type":"ContainerDied","Data":"c40607fb3f6709e8646ede5fe62a56e672bd0d6a7dd7353dc5abcf81e2d83ae9"} Oct 03 19:42:03 crc kubenswrapper[4835]: I1003 19:42:03.891188 4835 generic.go:334] "Generic (PLEG): container finished" podID="557cbe87-0345-47a7-8c73-eb52ddd7be64" containerID="d5a298dc101cebded51b07989873ddf8187d64ba49828d94d658fc8638179dcc" exitCode=0 Oct 03 19:42:03 crc kubenswrapper[4835]: I1003 19:42:03.891232 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42g5f" event={"ID":"557cbe87-0345-47a7-8c73-eb52ddd7be64","Type":"ContainerDied","Data":"d5a298dc101cebded51b07989873ddf8187d64ba49828d94d658fc8638179dcc"} Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.234896 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.295515 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m58tx\" (UniqueName: \"kubernetes.io/projected/cb3d2705-f4b5-4fac-92f3-cedaec209580-kube-api-access-m58tx\") pod \"cb3d2705-f4b5-4fac-92f3-cedaec209580\" (UID: \"cb3d2705-f4b5-4fac-92f3-cedaec209580\") " Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.295604 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb3d2705-f4b5-4fac-92f3-cedaec209580-utilities\") pod \"cb3d2705-f4b5-4fac-92f3-cedaec209580\" (UID: \"cb3d2705-f4b5-4fac-92f3-cedaec209580\") " Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.295713 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb3d2705-f4b5-4fac-92f3-cedaec209580-catalog-content\") pod \"cb3d2705-f4b5-4fac-92f3-cedaec209580\" (UID: \"cb3d2705-f4b5-4fac-92f3-cedaec209580\") " Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.296459 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb3d2705-f4b5-4fac-92f3-cedaec209580-utilities" (OuterVolumeSpecName: "utilities") pod "cb3d2705-f4b5-4fac-92f3-cedaec209580" (UID: "cb3d2705-f4b5-4fac-92f3-cedaec209580"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.307489 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb3d2705-f4b5-4fac-92f3-cedaec209580-kube-api-access-m58tx" (OuterVolumeSpecName: "kube-api-access-m58tx") pod "cb3d2705-f4b5-4fac-92f3-cedaec209580" (UID: "cb3d2705-f4b5-4fac-92f3-cedaec209580"). InnerVolumeSpecName "kube-api-access-m58tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.336574 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb3d2705-f4b5-4fac-92f3-cedaec209580-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb3d2705-f4b5-4fac-92f3-cedaec209580" (UID: "cb3d2705-f4b5-4fac-92f3-cedaec209580"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.399728 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m58tx\" (UniqueName: \"kubernetes.io/projected/cb3d2705-f4b5-4fac-92f3-cedaec209580-kube-api-access-m58tx\") on node \"crc\" DevicePath \"\"" Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.399775 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb3d2705-f4b5-4fac-92f3-cedaec209580-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.399787 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb3d2705-f4b5-4fac-92f3-cedaec209580-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.903638 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwvhb" event={"ID":"cb3d2705-f4b5-4fac-92f3-cedaec209580","Type":"ContainerDied","Data":"71c9f7f91c12f9d6e2378bc9301d1e4a8d53d1a80006610ddc3c341055389108"} Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.903719 4835 scope.go:117] "RemoveContainer" containerID="c40607fb3f6709e8646ede5fe62a56e672bd0d6a7dd7353dc5abcf81e2d83ae9" Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.903752 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwvhb" Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.940707 4835 scope.go:117] "RemoveContainer" containerID="b182d9f411b08f293f954d4a279c9766516a8a35d9238bcac8867328f6647596" Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.952455 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gwvhb"] Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.960834 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gwvhb"] Oct 03 19:42:04 crc kubenswrapper[4835]: I1003 19:42:04.965605 4835 scope.go:117] "RemoveContainer" containerID="0985f2a09c774b947329714f593ddd9d5d8cd8b8d0cce87ef2a0ac58e91c890c" Oct 03 19:42:06 crc kubenswrapper[4835]: I1003 19:42:06.893801 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb3d2705-f4b5-4fac-92f3-cedaec209580" path="/var/lib/kubelet/pods/cb3d2705-f4b5-4fac-92f3-cedaec209580/volumes" Oct 03 19:42:06 crc kubenswrapper[4835]: I1003 19:42:06.945353 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42g5f" event={"ID":"557cbe87-0345-47a7-8c73-eb52ddd7be64","Type":"ContainerStarted","Data":"924d9539e24d640a9d5cf8759f4349eb8a3e13b96a0c64a0c9063530134f5a96"} Oct 03 19:42:06 crc kubenswrapper[4835]: I1003 19:42:06.978685 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-42g5f" podStartSLOduration=3.156898604 podStartE2EDuration="8.978603228s" podCreationTimestamp="2025-10-03 19:41:58 +0000 UTC" firstStartedPulling="2025-10-03 19:42:00.850728585 +0000 UTC m=+5262.566669497" lastFinishedPulling="2025-10-03 19:42:06.672433249 +0000 UTC m=+5268.388374121" observedRunningTime="2025-10-03 19:42:06.961796015 +0000 UTC m=+5268.677736897" watchObservedRunningTime="2025-10-03 19:42:06.978603228 +0000 UTC m=+5268.694544100" Oct 03 19:42:09 crc kubenswrapper[4835]: I1003 19:42:09.173110 4835 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:42:09 crc kubenswrapper[4835]: I1003 19:42:09.173579 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:42:10 crc kubenswrapper[4835]: I1003 19:42:10.231580 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-42g5f" podUID="557cbe87-0345-47a7-8c73-eb52ddd7be64" containerName="registry-server" probeResult="failure" output=< Oct 03 19:42:10 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Oct 03 19:42:10 crc kubenswrapper[4835]: > Oct 03 19:42:13 crc kubenswrapper[4835]: I1003 19:42:13.878062 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:42:13 crc kubenswrapper[4835]: E1003 19:42:13.880719 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:42:19 crc kubenswrapper[4835]: I1003 19:42:19.218815 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:42:19 crc kubenswrapper[4835]: I1003 19:42:19.289176 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:42:19 crc kubenswrapper[4835]: I1003 19:42:19.467260 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-42g5f"] Oct 03 19:42:21 crc kubenswrapper[4835]: I1003 19:42:21.105917 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-42g5f" podUID="557cbe87-0345-47a7-8c73-eb52ddd7be64" containerName="registry-server" containerID="cri-o://924d9539e24d640a9d5cf8759f4349eb8a3e13b96a0c64a0c9063530134f5a96" gracePeriod=2 Oct 03 19:42:22 crc kubenswrapper[4835]: I1003 19:42:22.123241 4835 generic.go:334] "Generic (PLEG): container finished" podID="557cbe87-0345-47a7-8c73-eb52ddd7be64" containerID="924d9539e24d640a9d5cf8759f4349eb8a3e13b96a0c64a0c9063530134f5a96" exitCode=0 Oct 03 19:42:22 crc kubenswrapper[4835]: I1003 19:42:22.123357 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42g5f" event={"ID":"557cbe87-0345-47a7-8c73-eb52ddd7be64","Type":"ContainerDied","Data":"924d9539e24d640a9d5cf8759f4349eb8a3e13b96a0c64a0c9063530134f5a96"} Oct 03 19:42:22 crc kubenswrapper[4835]: I1003 19:42:22.930289 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:42:23 crc kubenswrapper[4835]: I1003 19:42:23.076748 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhvgn\" (UniqueName: \"kubernetes.io/projected/557cbe87-0345-47a7-8c73-eb52ddd7be64-kube-api-access-mhvgn\") pod \"557cbe87-0345-47a7-8c73-eb52ddd7be64\" (UID: \"557cbe87-0345-47a7-8c73-eb52ddd7be64\") " Oct 03 19:42:23 crc kubenswrapper[4835]: I1003 19:42:23.076896 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557cbe87-0345-47a7-8c73-eb52ddd7be64-catalog-content\") pod \"557cbe87-0345-47a7-8c73-eb52ddd7be64\" (UID: \"557cbe87-0345-47a7-8c73-eb52ddd7be64\") " Oct 03 19:42:23 crc kubenswrapper[4835]: I1003 19:42:23.077050 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557cbe87-0345-47a7-8c73-eb52ddd7be64-utilities\") pod \"557cbe87-0345-47a7-8c73-eb52ddd7be64\" (UID: \"557cbe87-0345-47a7-8c73-eb52ddd7be64\") " Oct 03 19:42:23 crc kubenswrapper[4835]: I1003 19:42:23.078136 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/557cbe87-0345-47a7-8c73-eb52ddd7be64-utilities" (OuterVolumeSpecName: "utilities") pod "557cbe87-0345-47a7-8c73-eb52ddd7be64" (UID: "557cbe87-0345-47a7-8c73-eb52ddd7be64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:42:23 crc kubenswrapper[4835]: I1003 19:42:23.083649 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/557cbe87-0345-47a7-8c73-eb52ddd7be64-kube-api-access-mhvgn" (OuterVolumeSpecName: "kube-api-access-mhvgn") pod "557cbe87-0345-47a7-8c73-eb52ddd7be64" (UID: "557cbe87-0345-47a7-8c73-eb52ddd7be64"). InnerVolumeSpecName "kube-api-access-mhvgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:42:23 crc kubenswrapper[4835]: I1003 19:42:23.137642 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42g5f" event={"ID":"557cbe87-0345-47a7-8c73-eb52ddd7be64","Type":"ContainerDied","Data":"132fa09da634bcddecaea076698267a9f91e6268a02e4622d035931c653f7adb"} Oct 03 19:42:23 crc kubenswrapper[4835]: I1003 19:42:23.137724 4835 scope.go:117] "RemoveContainer" containerID="924d9539e24d640a9d5cf8759f4349eb8a3e13b96a0c64a0c9063530134f5a96" Oct 03 19:42:23 crc kubenswrapper[4835]: I1003 19:42:23.137971 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-42g5f" Oct 03 19:42:23 crc kubenswrapper[4835]: I1003 19:42:23.169637 4835 scope.go:117] "RemoveContainer" containerID="d5a298dc101cebded51b07989873ddf8187d64ba49828d94d658fc8638179dcc" Oct 03 19:42:23 crc kubenswrapper[4835]: I1003 19:42:23.181769 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/557cbe87-0345-47a7-8c73-eb52ddd7be64-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:42:23 crc kubenswrapper[4835]: I1003 19:42:23.181809 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhvgn\" (UniqueName: \"kubernetes.io/projected/557cbe87-0345-47a7-8c73-eb52ddd7be64-kube-api-access-mhvgn\") on node \"crc\" DevicePath \"\"" Oct 03 19:42:23 crc kubenswrapper[4835]: I1003 19:42:23.187544 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/557cbe87-0345-47a7-8c73-eb52ddd7be64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "557cbe87-0345-47a7-8c73-eb52ddd7be64" (UID: "557cbe87-0345-47a7-8c73-eb52ddd7be64"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:42:23 crc kubenswrapper[4835]: I1003 19:42:23.206912 4835 scope.go:117] "RemoveContainer" containerID="a6a1d3f7e8e38ac23260220a24385859f41dcf32aabb663ae31e183095e92d33" Oct 03 19:42:23 crc kubenswrapper[4835]: I1003 19:42:23.283741 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/557cbe87-0345-47a7-8c73-eb52ddd7be64-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:42:23 crc kubenswrapper[4835]: I1003 19:42:23.479863 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-42g5f"] Oct 03 19:42:23 crc kubenswrapper[4835]: I1003 19:42:23.491085 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-42g5f"] Oct 03 19:42:24 crc kubenswrapper[4835]: I1003 19:42:24.878043 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:42:24 crc kubenswrapper[4835]: E1003 19:42:24.878763 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:42:24 crc kubenswrapper[4835]: I1003 19:42:24.894495 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="557cbe87-0345-47a7-8c73-eb52ddd7be64" path="/var/lib/kubelet/pods/557cbe87-0345-47a7-8c73-eb52ddd7be64/volumes" Oct 03 19:42:36 crc kubenswrapper[4835]: I1003 19:42:36.877699 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:42:36 crc kubenswrapper[4835]: E1003 19:42:36.879262 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:42:39 crc kubenswrapper[4835]: I1003 19:42:39.977160 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6rbvn"] Oct 03 19:42:39 crc kubenswrapper[4835]: E1003 19:42:39.980730 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3d2705-f4b5-4fac-92f3-cedaec209580" containerName="extract-utilities" Oct 03 19:42:39 crc kubenswrapper[4835]: I1003 19:42:39.980758 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3d2705-f4b5-4fac-92f3-cedaec209580" containerName="extract-utilities" Oct 03 19:42:39 crc kubenswrapper[4835]: E1003 19:42:39.980795 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557cbe87-0345-47a7-8c73-eb52ddd7be64" containerName="registry-server" Oct 03 19:42:39 crc kubenswrapper[4835]: I1003 19:42:39.980805 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="557cbe87-0345-47a7-8c73-eb52ddd7be64" containerName="registry-server" Oct 03 19:42:39 crc kubenswrapper[4835]: E1003 19:42:39.980824 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3d2705-f4b5-4fac-92f3-cedaec209580" containerName="extract-content" Oct 03 19:42:39 crc kubenswrapper[4835]: I1003 19:42:39.980837 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3d2705-f4b5-4fac-92f3-cedaec209580" containerName="extract-content" Oct 03 19:42:39 crc kubenswrapper[4835]: E1003 19:42:39.980858 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557cbe87-0345-47a7-8c73-eb52ddd7be64" containerName="extract-content" Oct 03 19:42:39 crc kubenswrapper[4835]: I1003 19:42:39.980870 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="557cbe87-0345-47a7-8c73-eb52ddd7be64" containerName="extract-content" Oct 03 19:42:39 crc kubenswrapper[4835]: E1003 19:42:39.980895 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3d2705-f4b5-4fac-92f3-cedaec209580" containerName="registry-server" Oct 03 19:42:39 crc kubenswrapper[4835]: I1003 19:42:39.980904 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3d2705-f4b5-4fac-92f3-cedaec209580" containerName="registry-server" Oct 03 19:42:39 crc kubenswrapper[4835]: E1003 19:42:39.980927 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557cbe87-0345-47a7-8c73-eb52ddd7be64" containerName="extract-utilities" Oct 03 19:42:39 crc kubenswrapper[4835]: I1003 19:42:39.980937 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="557cbe87-0345-47a7-8c73-eb52ddd7be64" containerName="extract-utilities" Oct 03 19:42:39 crc kubenswrapper[4835]: I1003 19:42:39.981527 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="557cbe87-0345-47a7-8c73-eb52ddd7be64" containerName="registry-server" Oct 03 19:42:39 crc kubenswrapper[4835]: I1003 19:42:39.981561 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb3d2705-f4b5-4fac-92f3-cedaec209580" containerName="registry-server" Oct 03 19:42:39 crc kubenswrapper[4835]: I1003 19:42:39.984098 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:42:40 crc kubenswrapper[4835]: I1003 19:42:40.001636 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6rbvn"] Oct 03 19:42:40 crc kubenswrapper[4835]: I1003 19:42:40.019723 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-utilities\") pod \"redhat-marketplace-6rbvn\" (UID: \"a1b5fab2-c1eb-4dbd-8924-f7164a634e28\") " pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:42:40 crc kubenswrapper[4835]: I1003 19:42:40.022461 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-catalog-content\") pod \"redhat-marketplace-6rbvn\" (UID: \"a1b5fab2-c1eb-4dbd-8924-f7164a634e28\") " pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:42:40 crc kubenswrapper[4835]: I1003 19:42:40.022674 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fwsn\" (UniqueName: \"kubernetes.io/projected/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-kube-api-access-8fwsn\") pod \"redhat-marketplace-6rbvn\" (UID: \"a1b5fab2-c1eb-4dbd-8924-f7164a634e28\") " pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:42:40 crc kubenswrapper[4835]: I1003 19:42:40.125213 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fwsn\" (UniqueName: \"kubernetes.io/projected/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-kube-api-access-8fwsn\") pod \"redhat-marketplace-6rbvn\" (UID: \"a1b5fab2-c1eb-4dbd-8924-f7164a634e28\") " pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:42:40 crc kubenswrapper[4835]: I1003 19:42:40.125287 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-utilities\") pod \"redhat-marketplace-6rbvn\" (UID: \"a1b5fab2-c1eb-4dbd-8924-f7164a634e28\") " pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:42:40 crc kubenswrapper[4835]: I1003 19:42:40.125385 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-catalog-content\") pod \"redhat-marketplace-6rbvn\" (UID: \"a1b5fab2-c1eb-4dbd-8924-f7164a634e28\") " pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:42:40 crc kubenswrapper[4835]: I1003 19:42:40.125858 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-catalog-content\") pod \"redhat-marketplace-6rbvn\" (UID: \"a1b5fab2-c1eb-4dbd-8924-f7164a634e28\") " pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:42:40 crc kubenswrapper[4835]: I1003 19:42:40.126449 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-utilities\") pod \"redhat-marketplace-6rbvn\" (UID: \"a1b5fab2-c1eb-4dbd-8924-f7164a634e28\") " pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:42:40 crc kubenswrapper[4835]: I1003 19:42:40.145850 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8fwsn\" (UniqueName: \"kubernetes.io/projected/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-kube-api-access-8fwsn\") pod \"redhat-marketplace-6rbvn\" (UID: \"a1b5fab2-c1eb-4dbd-8924-f7164a634e28\") " pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:42:40 crc kubenswrapper[4835]: I1003 19:42:40.330949 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:42:40 crc kubenswrapper[4835]: I1003 19:42:40.845421 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6rbvn"] Oct 03 19:42:40 crc kubenswrapper[4835]: W1003 19:42:40.859029 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b5fab2_c1eb_4dbd_8924_f7164a634e28.slice/crio-fc4feba48fc21f2c55eda048f40f8ff32e93a59346a6532fc60ec9f00a24b37a WatchSource:0}: Error finding container fc4feba48fc21f2c55eda048f40f8ff32e93a59346a6532fc60ec9f00a24b37a: Status 404 returned error can't find the container with id fc4feba48fc21f2c55eda048f40f8ff32e93a59346a6532fc60ec9f00a24b37a Oct 03 19:42:41 crc kubenswrapper[4835]: I1003 19:42:41.355149 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6rbvn" event={"ID":"a1b5fab2-c1eb-4dbd-8924-f7164a634e28","Type":"ContainerStarted","Data":"d24f0ba9bc676cd8f0fd3553cf0b682ad23465aa4d7847659c4f115b2c845dd6"} Oct 03 19:42:41 crc kubenswrapper[4835]: I1003 19:42:41.355211 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6rbvn" event={"ID":"a1b5fab2-c1eb-4dbd-8924-f7164a634e28","Type":"ContainerStarted","Data":"fc4feba48fc21f2c55eda048f40f8ff32e93a59346a6532fc60ec9f00a24b37a"} Oct 03 19:42:42 crc kubenswrapper[4835]: I1003 19:42:42.369919 4835 generic.go:334] "Generic (PLEG): container finished" podID="a1b5fab2-c1eb-4dbd-8924-f7164a634e28" containerID="d24f0ba9bc676cd8f0fd3553cf0b682ad23465aa4d7847659c4f115b2c845dd6" exitCode=0 Oct 03 19:42:42 crc kubenswrapper[4835]: I1003 19:42:42.370047 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6rbvn" event={"ID":"a1b5fab2-c1eb-4dbd-8924-f7164a634e28","Type":"ContainerDied","Data":"d24f0ba9bc676cd8f0fd3553cf0b682ad23465aa4d7847659c4f115b2c845dd6"} Oct 03 19:42:42 crc kubenswrapper[4835]: I1003 19:42:42.973382 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f4bzn"] Oct 03 19:42:42 crc kubenswrapper[4835]: I1003 19:42:42.976430 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:42:42 crc kubenswrapper[4835]: I1003 19:42:42.996570 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f4bzn"] Oct 03 19:42:43 crc kubenswrapper[4835]: I1003 19:42:43.109684 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-catalog-content\") pod \"community-operators-f4bzn\" (UID: \"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0\") " pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:42:43 crc kubenswrapper[4835]: I1003 19:42:43.109836 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-utilities\") pod \"community-operators-f4bzn\" (UID: \"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0\") " pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:42:43 crc kubenswrapper[4835]: I1003 19:42:43.109965 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkgmg\" (UniqueName: \"kubernetes.io/projected/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-kube-api-access-hkgmg\") pod \"community-operators-f4bzn\" (UID: \"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0\") " pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:42:43 crc kubenswrapper[4835]: I1003 19:42:43.212136 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-catalog-content\") pod \"community-operators-f4bzn\" (UID: \"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0\") " pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:42:43 crc kubenswrapper[4835]: I1003 19:42:43.212257 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-utilities\") pod \"community-operators-f4bzn\" (UID: \"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0\") " pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:42:43 crc kubenswrapper[4835]: I1003 19:42:43.212330 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkgmg\" (UniqueName: \"kubernetes.io/projected/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-kube-api-access-hkgmg\") pod \"community-operators-f4bzn\" (UID: \"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0\") " pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:42:43 crc kubenswrapper[4835]: I1003 19:42:43.212956 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-catalog-content\") pod \"community-operators-f4bzn\" (UID: \"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0\") " pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:42:43 crc kubenswrapper[4835]: I1003 19:42:43.213029 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-utilities\") pod \"community-operators-f4bzn\" (UID: \"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0\") " pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:42:43 crc kubenswrapper[4835]: I1003 19:42:43.241468 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hkgmg\" (UniqueName: \"kubernetes.io/projected/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-kube-api-access-hkgmg\") pod \"community-operators-f4bzn\" (UID: \"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0\") " pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:42:43 crc kubenswrapper[4835]: I1003 19:42:43.330573 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:42:44 crc kubenswrapper[4835]: I1003 19:42:44.022838 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f4bzn"] Oct 03 19:42:44 crc kubenswrapper[4835]: I1003 19:42:44.408182 4835 generic.go:334] "Generic (PLEG): container finished" podID="ef96c8cc-6f71-469b-8619-d2d50ea1e1d0" containerID="888ccab9a7350dffe28ac4572b3395c5509678c0df81be540894e12414bd5283" exitCode=0 Oct 03 19:42:44 crc kubenswrapper[4835]: I1003 19:42:44.408283 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4bzn" event={"ID":"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0","Type":"ContainerDied","Data":"888ccab9a7350dffe28ac4572b3395c5509678c0df81be540894e12414bd5283"} Oct 03 19:42:44 crc kubenswrapper[4835]: I1003 19:42:44.408702 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4bzn" event={"ID":"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0","Type":"ContainerStarted","Data":"c3639f172c5af8fe31071fff56006c8ee184e0fe6fd44b738d202d33957ddbba"} Oct 03 19:42:45 crc kubenswrapper[4835]: I1003 19:42:45.420298 4835 generic.go:334] "Generic (PLEG): container finished" podID="a1b5fab2-c1eb-4dbd-8924-f7164a634e28" containerID="dc38a47b014ee0fc2cb0bba0bc7d849b69819b3614e7ec6395563940ae961a63" exitCode=0 Oct 03 19:42:45 crc kubenswrapper[4835]: I1003 19:42:45.420373 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6rbvn" event={"ID":"a1b5fab2-c1eb-4dbd-8924-f7164a634e28","Type":"ContainerDied","Data":"dc38a47b014ee0fc2cb0bba0bc7d849b69819b3614e7ec6395563940ae961a63"} Oct 03 19:42:47 crc kubenswrapper[4835]: I1003 19:42:47.448376 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6rbvn" event={"ID":"a1b5fab2-c1eb-4dbd-8924-f7164a634e28","Type":"ContainerStarted","Data":"5c2e5655f62185a9241bedb9b9cdb3095f253fb5e5eac87278d62c4ce8a98bc0"} Oct 03 19:42:47 crc kubenswrapper[4835]: I1003 19:42:47.487515 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6rbvn" podStartSLOduration=4.264199369 podStartE2EDuration="8.487490304s" podCreationTimestamp="2025-10-03 19:42:39 +0000 UTC" firstStartedPulling="2025-10-03 19:42:42.374375596 +0000 UTC m=+5304.090316468" lastFinishedPulling="2025-10-03 19:42:46.597666531 +0000 UTC m=+5308.313607403" observedRunningTime="2025-10-03 19:42:47.474697479 +0000 UTC m=+5309.190638341" watchObservedRunningTime="2025-10-03 19:42:47.487490304 +0000 UTC m=+5309.203431176" Oct 03 19:42:48 crc kubenswrapper[4835]: I1003 19:42:48.470371 4835 generic.go:334] "Generic (PLEG): container finished" podID="ef96c8cc-6f71-469b-8619-d2d50ea1e1d0" containerID="919f52b083da4c3add284f5fdf68193513f78ba16e78ad7153ac12f3e991cfd6" exitCode=0 Oct 03 19:42:48 crc kubenswrapper[4835]: I1003 19:42:48.470452 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4bzn" 
event={"ID":"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0","Type":"ContainerDied","Data":"919f52b083da4c3add284f5fdf68193513f78ba16e78ad7153ac12f3e991cfd6"} Oct 03 19:42:50 crc kubenswrapper[4835]: I1003 19:42:50.331899 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:42:50 crc kubenswrapper[4835]: I1003 19:42:50.332930 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:42:50 crc kubenswrapper[4835]: I1003 19:42:50.391609 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:42:50 crc kubenswrapper[4835]: I1003 19:42:50.502042 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4bzn" event={"ID":"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0","Type":"ContainerStarted","Data":"148a35a75ec8fb0ffe4f44bf2326db1d200ef01b61529d9c68dc4cc399bf4871"} Oct 03 19:42:50 crc kubenswrapper[4835]: I1003 19:42:50.532314 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f4bzn" podStartSLOduration=4.63992686 podStartE2EDuration="8.532288196s" podCreationTimestamp="2025-10-03 19:42:42 +0000 UTC" firstStartedPulling="2025-10-03 19:42:45.427701417 +0000 UTC m=+5307.143642289" lastFinishedPulling="2025-10-03 19:42:49.320062713 +0000 UTC m=+5311.036003625" observedRunningTime="2025-10-03 19:42:50.523729856 +0000 UTC m=+5312.239670748" watchObservedRunningTime="2025-10-03 19:42:50.532288196 +0000 UTC m=+5312.248229068" Oct 03 19:42:51 crc kubenswrapper[4835]: I1003 19:42:51.878109 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:42:51 crc kubenswrapper[4835]: E1003 19:42:51.879603 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:42:53 crc kubenswrapper[4835]: I1003 19:42:53.331580 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:42:53 crc kubenswrapper[4835]: I1003 19:42:53.332107 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:42:53 crc kubenswrapper[4835]: I1003 19:42:53.418715 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:43:00 crc kubenswrapper[4835]: I1003 19:43:00.399997 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:43:00 crc kubenswrapper[4835]: I1003 19:43:00.471671 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6rbvn"] Oct 03 19:43:00 crc kubenswrapper[4835]: I1003 19:43:00.605896 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6rbvn" podUID="a1b5fab2-c1eb-4dbd-8924-f7164a634e28" containerName="registry-server" 
containerID="cri-o://5c2e5655f62185a9241bedb9b9cdb3095f253fb5e5eac87278d62c4ce8a98bc0" gracePeriod=2 Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.110062 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.263957 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-catalog-content\") pod \"a1b5fab2-c1eb-4dbd-8924-f7164a634e28\" (UID: \"a1b5fab2-c1eb-4dbd-8924-f7164a634e28\") " Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.264154 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fwsn\" (UniqueName: \"kubernetes.io/projected/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-kube-api-access-8fwsn\") pod \"a1b5fab2-c1eb-4dbd-8924-f7164a634e28\" (UID: \"a1b5fab2-c1eb-4dbd-8924-f7164a634e28\") " Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.264204 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-utilities\") pod \"a1b5fab2-c1eb-4dbd-8924-f7164a634e28\" (UID: \"a1b5fab2-c1eb-4dbd-8924-f7164a634e28\") " Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.265506 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-utilities" (OuterVolumeSpecName: "utilities") pod "a1b5fab2-c1eb-4dbd-8924-f7164a634e28" (UID: "a1b5fab2-c1eb-4dbd-8924-f7164a634e28"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.273469 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-kube-api-access-8fwsn" (OuterVolumeSpecName: "kube-api-access-8fwsn") pod "a1b5fab2-c1eb-4dbd-8924-f7164a634e28" (UID: "a1b5fab2-c1eb-4dbd-8924-f7164a634e28"). InnerVolumeSpecName "kube-api-access-8fwsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.300630 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1b5fab2-c1eb-4dbd-8924-f7164a634e28" (UID: "a1b5fab2-c1eb-4dbd-8924-f7164a634e28"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.367597 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.367639 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fwsn\" (UniqueName: \"kubernetes.io/projected/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-kube-api-access-8fwsn\") on node \"crc\" DevicePath \"\"" Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.367651 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1b5fab2-c1eb-4dbd-8924-f7164a634e28-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.623070 4835 generic.go:334] "Generic (PLEG): container finished" podID="a1b5fab2-c1eb-4dbd-8924-f7164a634e28" containerID="5c2e5655f62185a9241bedb9b9cdb3095f253fb5e5eac87278d62c4ce8a98bc0" exitCode=0 Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.623145 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6rbvn" event={"ID":"a1b5fab2-c1eb-4dbd-8924-f7164a634e28","Type":"ContainerDied","Data":"5c2e5655f62185a9241bedb9b9cdb3095f253fb5e5eac87278d62c4ce8a98bc0"} Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.623217 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6rbvn" Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.623244 4835 scope.go:117] "RemoveContainer" containerID="5c2e5655f62185a9241bedb9b9cdb3095f253fb5e5eac87278d62c4ce8a98bc0" Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.623228 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6rbvn" event={"ID":"a1b5fab2-c1eb-4dbd-8924-f7164a634e28","Type":"ContainerDied","Data":"fc4feba48fc21f2c55eda048f40f8ff32e93a59346a6532fc60ec9f00a24b37a"} Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.684997 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6rbvn"] Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.687628 4835 scope.go:117] "RemoveContainer" containerID="dc38a47b014ee0fc2cb0bba0bc7d849b69819b3614e7ec6395563940ae961a63" Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.696567 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6rbvn"] Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.733187 4835 scope.go:117] "RemoveContainer" containerID="d24f0ba9bc676cd8f0fd3553cf0b682ad23465aa4d7847659c4f115b2c845dd6" Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.802873 4835 scope.go:117] "RemoveContainer" containerID="5c2e5655f62185a9241bedb9b9cdb3095f253fb5e5eac87278d62c4ce8a98bc0" Oct 03 19:43:01 crc kubenswrapper[4835]: E1003 19:43:01.803151 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c2e5655f62185a9241bedb9b9cdb3095f253fb5e5eac87278d62c4ce8a98bc0\": container with ID starting with 5c2e5655f62185a9241bedb9b9cdb3095f253fb5e5eac87278d62c4ce8a98bc0 not found: ID does not exist" containerID="5c2e5655f62185a9241bedb9b9cdb3095f253fb5e5eac87278d62c4ce8a98bc0" Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.803192 4835 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c2e5655f62185a9241bedb9b9cdb3095f253fb5e5eac87278d62c4ce8a98bc0"} err="failed to get container status \"5c2e5655f62185a9241bedb9b9cdb3095f253fb5e5eac87278d62c4ce8a98bc0\": rpc error: code = NotFound desc = could not find container \"5c2e5655f62185a9241bedb9b9cdb3095f253fb5e5eac87278d62c4ce8a98bc0\": container with ID starting with 5c2e5655f62185a9241bedb9b9cdb3095f253fb5e5eac87278d62c4ce8a98bc0 not found: ID does not exist" Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.803223 4835 scope.go:117] "RemoveContainer" containerID="dc38a47b014ee0fc2cb0bba0bc7d849b69819b3614e7ec6395563940ae961a63" Oct 03 19:43:01 crc kubenswrapper[4835]: E1003 19:43:01.803617 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc38a47b014ee0fc2cb0bba0bc7d849b69819b3614e7ec6395563940ae961a63\": container with ID starting with dc38a47b014ee0fc2cb0bba0bc7d849b69819b3614e7ec6395563940ae961a63 not found: ID does not exist" containerID="dc38a47b014ee0fc2cb0bba0bc7d849b69819b3614e7ec6395563940ae961a63" Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.803737 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc38a47b014ee0fc2cb0bba0bc7d849b69819b3614e7ec6395563940ae961a63"} err="failed to get container status \"dc38a47b014ee0fc2cb0bba0bc7d849b69819b3614e7ec6395563940ae961a63\": rpc error: code = NotFound desc = could not find container \"dc38a47b014ee0fc2cb0bba0bc7d849b69819b3614e7ec6395563940ae961a63\": container with ID starting with dc38a47b014ee0fc2cb0bba0bc7d849b69819b3614e7ec6395563940ae961a63 not found: ID does not exist" Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.803834 4835 scope.go:117] "RemoveContainer" containerID="d24f0ba9bc676cd8f0fd3553cf0b682ad23465aa4d7847659c4f115b2c845dd6" Oct 03 19:43:01 crc kubenswrapper[4835]: E1003 19:43:01.805043 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d24f0ba9bc676cd8f0fd3553cf0b682ad23465aa4d7847659c4f115b2c845dd6\": container with ID starting with d24f0ba9bc676cd8f0fd3553cf0b682ad23465aa4d7847659c4f115b2c845dd6 not found: ID does not exist" containerID="d24f0ba9bc676cd8f0fd3553cf0b682ad23465aa4d7847659c4f115b2c845dd6" Oct 03 19:43:01 crc kubenswrapper[4835]: I1003 19:43:01.805097 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d24f0ba9bc676cd8f0fd3553cf0b682ad23465aa4d7847659c4f115b2c845dd6"} err="failed to get container status \"d24f0ba9bc676cd8f0fd3553cf0b682ad23465aa4d7847659c4f115b2c845dd6\": rpc error: code = NotFound desc = could not find container \"d24f0ba9bc676cd8f0fd3553cf0b682ad23465aa4d7847659c4f115b2c845dd6\": container with ID starting with d24f0ba9bc676cd8f0fd3553cf0b682ad23465aa4d7847659c4f115b2c845dd6 not found: ID does not exist" Oct 03 19:43:02 crc kubenswrapper[4835]: I1003 19:43:02.897636 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1b5fab2-c1eb-4dbd-8924-f7164a634e28" path="/var/lib/kubelet/pods/a1b5fab2-c1eb-4dbd-8924-f7164a634e28/volumes" Oct 03 19:43:03 crc kubenswrapper[4835]: I1003 19:43:03.404392 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.046813 4835 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-f4bzn"] Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.047766 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f4bzn" podUID="ef96c8cc-6f71-469b-8619-d2d50ea1e1d0" containerName="registry-server" containerID="cri-o://148a35a75ec8fb0ffe4f44bf2326db1d200ef01b61529d9c68dc4cc399bf4871" gracePeriod=2 Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.578800 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.647819 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-utilities\") pod \"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0\" (UID: \"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0\") " Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.647948 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-catalog-content\") pod \"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0\" (UID: \"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0\") " Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.648048 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkgmg\" (UniqueName: \"kubernetes.io/projected/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-kube-api-access-hkgmg\") pod \"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0\" (UID: \"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0\") " Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.650203 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-utilities" (OuterVolumeSpecName: "utilities") pod "ef96c8cc-6f71-469b-8619-d2d50ea1e1d0" (UID: "ef96c8cc-6f71-469b-8619-d2d50ea1e1d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.655742 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-kube-api-access-hkgmg" (OuterVolumeSpecName: "kube-api-access-hkgmg") pod "ef96c8cc-6f71-469b-8619-d2d50ea1e1d0" (UID: "ef96c8cc-6f71-469b-8619-d2d50ea1e1d0"). InnerVolumeSpecName "kube-api-access-hkgmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.659893 4835 generic.go:334] "Generic (PLEG): container finished" podID="ef96c8cc-6f71-469b-8619-d2d50ea1e1d0" containerID="148a35a75ec8fb0ffe4f44bf2326db1d200ef01b61529d9c68dc4cc399bf4871" exitCode=0 Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.659941 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4bzn" event={"ID":"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0","Type":"ContainerDied","Data":"148a35a75ec8fb0ffe4f44bf2326db1d200ef01b61529d9c68dc4cc399bf4871"} Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.659980 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f4bzn" event={"ID":"ef96c8cc-6f71-469b-8619-d2d50ea1e1d0","Type":"ContainerDied","Data":"c3639f172c5af8fe31071fff56006c8ee184e0fe6fd44b738d202d33957ddbba"} Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.660002 4835 scope.go:117] "RemoveContainer" containerID="148a35a75ec8fb0ffe4f44bf2326db1d200ef01b61529d9c68dc4cc399bf4871" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.660202 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f4bzn" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.701216 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef96c8cc-6f71-469b-8619-d2d50ea1e1d0" (UID: "ef96c8cc-6f71-469b-8619-d2d50ea1e1d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.720276 4835 scope.go:117] "RemoveContainer" containerID="919f52b083da4c3add284f5fdf68193513f78ba16e78ad7153ac12f3e991cfd6" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.749215 4835 scope.go:117] "RemoveContainer" containerID="888ccab9a7350dffe28ac4572b3395c5509678c0df81be540894e12414bd5283" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.751343 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkgmg\" (UniqueName: \"kubernetes.io/projected/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-kube-api-access-hkgmg\") on node \"crc\" DevicePath \"\"" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.751373 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.751382 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.811137 4835 scope.go:117] "RemoveContainer" containerID="148a35a75ec8fb0ffe4f44bf2326db1d200ef01b61529d9c68dc4cc399bf4871" Oct 03 19:43:04 crc kubenswrapper[4835]: E1003 19:43:04.811894 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"148a35a75ec8fb0ffe4f44bf2326db1d200ef01b61529d9c68dc4cc399bf4871\": container with ID starting with 148a35a75ec8fb0ffe4f44bf2326db1d200ef01b61529d9c68dc4cc399bf4871 not found: ID does not exist" 
containerID="148a35a75ec8fb0ffe4f44bf2326db1d200ef01b61529d9c68dc4cc399bf4871" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.811968 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"148a35a75ec8fb0ffe4f44bf2326db1d200ef01b61529d9c68dc4cc399bf4871"} err="failed to get container status \"148a35a75ec8fb0ffe4f44bf2326db1d200ef01b61529d9c68dc4cc399bf4871\": rpc error: code = NotFound desc = could not find container \"148a35a75ec8fb0ffe4f44bf2326db1d200ef01b61529d9c68dc4cc399bf4871\": container with ID starting with 148a35a75ec8fb0ffe4f44bf2326db1d200ef01b61529d9c68dc4cc399bf4871 not found: ID does not exist" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.812014 4835 scope.go:117] "RemoveContainer" containerID="919f52b083da4c3add284f5fdf68193513f78ba16e78ad7153ac12f3e991cfd6" Oct 03 19:43:04 crc kubenswrapper[4835]: E1003 19:43:04.813131 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"919f52b083da4c3add284f5fdf68193513f78ba16e78ad7153ac12f3e991cfd6\": container with ID starting with 919f52b083da4c3add284f5fdf68193513f78ba16e78ad7153ac12f3e991cfd6 not found: ID does not exist" containerID="919f52b083da4c3add284f5fdf68193513f78ba16e78ad7153ac12f3e991cfd6" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.813205 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"919f52b083da4c3add284f5fdf68193513f78ba16e78ad7153ac12f3e991cfd6"} err="failed to get container status \"919f52b083da4c3add284f5fdf68193513f78ba16e78ad7153ac12f3e991cfd6\": rpc error: code = NotFound desc = could not find container \"919f52b083da4c3add284f5fdf68193513f78ba16e78ad7153ac12f3e991cfd6\": container with ID starting with 919f52b083da4c3add284f5fdf68193513f78ba16e78ad7153ac12f3e991cfd6 not found: ID does not exist" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.813241 4835 scope.go:117] "RemoveContainer" containerID="888ccab9a7350dffe28ac4572b3395c5509678c0df81be540894e12414bd5283" Oct 03 19:43:04 crc kubenswrapper[4835]: E1003 19:43:04.813792 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"888ccab9a7350dffe28ac4572b3395c5509678c0df81be540894e12414bd5283\": container with ID starting with 888ccab9a7350dffe28ac4572b3395c5509678c0df81be540894e12414bd5283 not found: ID does not exist" containerID="888ccab9a7350dffe28ac4572b3395c5509678c0df81be540894e12414bd5283" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.813852 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"888ccab9a7350dffe28ac4572b3395c5509678c0df81be540894e12414bd5283"} err="failed to get container status \"888ccab9a7350dffe28ac4572b3395c5509678c0df81be540894e12414bd5283\": rpc error: code = NotFound desc = could not find container \"888ccab9a7350dffe28ac4572b3395c5509678c0df81be540894e12414bd5283\": container with ID starting with 888ccab9a7350dffe28ac4572b3395c5509678c0df81be540894e12414bd5283 not found: ID does not exist" Oct 03 19:43:04 crc kubenswrapper[4835]: I1003 19:43:04.998248 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f4bzn"] Oct 03 19:43:05 crc kubenswrapper[4835]: I1003 19:43:05.010459 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f4bzn"] Oct 03 19:43:05 crc kubenswrapper[4835]: I1003 19:43:05.877144 
4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:43:05 crc kubenswrapper[4835]: E1003 19:43:05.877876 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:43:06 crc kubenswrapper[4835]: I1003 19:43:06.896519 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef96c8cc-6f71-469b-8619-d2d50ea1e1d0" path="/var/lib/kubelet/pods/ef96c8cc-6f71-469b-8619-d2d50ea1e1d0/volumes" Oct 03 19:43:17 crc kubenswrapper[4835]: I1003 19:43:17.877776 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:43:17 crc kubenswrapper[4835]: E1003 19:43:17.878803 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:43:29 crc kubenswrapper[4835]: I1003 19:43:29.876821 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:43:29 crc kubenswrapper[4835]: E1003 19:43:29.877928 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:43:43 crc kubenswrapper[4835]: I1003 19:43:43.877581 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:43:45 crc kubenswrapper[4835]: I1003 19:43:45.155851 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"52bdb132b9879678c3a804484b5f7f9e013178301e27181cb58655f3c44e07cd"} Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.163008 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql"] Oct 03 19:45:00 crc kubenswrapper[4835]: E1003 19:45:00.164370 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b5fab2-c1eb-4dbd-8924-f7164a634e28" containerName="extract-utilities" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.164390 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b5fab2-c1eb-4dbd-8924-f7164a634e28" containerName="extract-utilities" Oct 03 19:45:00 crc kubenswrapper[4835]: E1003 19:45:00.164413 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b5fab2-c1eb-4dbd-8924-f7164a634e28" containerName="extract-content" Oct 03 19:45:00 crc 
kubenswrapper[4835]: I1003 19:45:00.164422 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b5fab2-c1eb-4dbd-8924-f7164a634e28" containerName="extract-content" Oct 03 19:45:00 crc kubenswrapper[4835]: E1003 19:45:00.164432 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef96c8cc-6f71-469b-8619-d2d50ea1e1d0" containerName="extract-utilities" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.164440 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef96c8cc-6f71-469b-8619-d2d50ea1e1d0" containerName="extract-utilities" Oct 03 19:45:00 crc kubenswrapper[4835]: E1003 19:45:00.164459 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b5fab2-c1eb-4dbd-8924-f7164a634e28" containerName="registry-server" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.164468 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b5fab2-c1eb-4dbd-8924-f7164a634e28" containerName="registry-server" Oct 03 19:45:00 crc kubenswrapper[4835]: E1003 19:45:00.164500 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef96c8cc-6f71-469b-8619-d2d50ea1e1d0" containerName="extract-content" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.164507 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef96c8cc-6f71-469b-8619-d2d50ea1e1d0" containerName="extract-content" Oct 03 19:45:00 crc kubenswrapper[4835]: E1003 19:45:00.164522 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef96c8cc-6f71-469b-8619-d2d50ea1e1d0" containerName="registry-server" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.164530 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef96c8cc-6f71-469b-8619-d2d50ea1e1d0" containerName="registry-server" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.164818 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b5fab2-c1eb-4dbd-8924-f7164a634e28" containerName="registry-server" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.164836 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef96c8cc-6f71-469b-8619-d2d50ea1e1d0" containerName="registry-server" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.165991 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.168811 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.169116 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.187714 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql"] Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.315786 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n6x4\" (UniqueName: \"kubernetes.io/projected/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-kube-api-access-6n6x4\") pod \"collect-profiles-29325345-wx2ql\" (UID: \"f5381f73-cd98-49fd-b2bc-b23f2dc037f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.316234 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-secret-volume\") pod \"collect-profiles-29325345-wx2ql\" (UID: \"f5381f73-cd98-49fd-b2bc-b23f2dc037f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.316481 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-config-volume\") pod \"collect-profiles-29325345-wx2ql\" (UID: \"f5381f73-cd98-49fd-b2bc-b23f2dc037f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.419119 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-config-volume\") pod \"collect-profiles-29325345-wx2ql\" (UID: \"f5381f73-cd98-49fd-b2bc-b23f2dc037f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.419257 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n6x4\" (UniqueName: \"kubernetes.io/projected/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-kube-api-access-6n6x4\") pod \"collect-profiles-29325345-wx2ql\" (UID: \"f5381f73-cd98-49fd-b2bc-b23f2dc037f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.419790 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-secret-volume\") pod \"collect-profiles-29325345-wx2ql\" (UID: \"f5381f73-cd98-49fd-b2bc-b23f2dc037f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.421042 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-config-volume\") pod 
\"collect-profiles-29325345-wx2ql\" (UID: \"f5381f73-cd98-49fd-b2bc-b23f2dc037f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.430872 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-secret-volume\") pod \"collect-profiles-29325345-wx2ql\" (UID: \"f5381f73-cd98-49fd-b2bc-b23f2dc037f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.441123 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n6x4\" (UniqueName: \"kubernetes.io/projected/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-kube-api-access-6n6x4\") pod \"collect-profiles-29325345-wx2ql\" (UID: \"f5381f73-cd98-49fd-b2bc-b23f2dc037f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.499824 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql" Oct 03 19:45:00 crc kubenswrapper[4835]: I1003 19:45:00.973905 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql"] Oct 03 19:45:01 crc kubenswrapper[4835]: I1003 19:45:01.077132 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql" event={"ID":"f5381f73-cd98-49fd-b2bc-b23f2dc037f0","Type":"ContainerStarted","Data":"5a8b03518f40ecb1bc4cb503b39ae75270afccbe1f7ca7831f458df1fa3b9239"} Oct 03 19:45:02 crc kubenswrapper[4835]: I1003 19:45:02.095400 4835 generic.go:334] "Generic (PLEG): container finished" podID="f5381f73-cd98-49fd-b2bc-b23f2dc037f0" containerID="7963b3b11ccafd5c3203d97905a22ecc60486c0215055cbdae9704024fcff734" exitCode=0 Oct 03 19:45:02 crc kubenswrapper[4835]: I1003 19:45:02.095989 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql" event={"ID":"f5381f73-cd98-49fd-b2bc-b23f2dc037f0","Type":"ContainerDied","Data":"7963b3b11ccafd5c3203d97905a22ecc60486c0215055cbdae9704024fcff734"} Oct 03 19:45:03 crc kubenswrapper[4835]: I1003 19:45:03.496977 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql" Oct 03 19:45:03 crc kubenswrapper[4835]: I1003 19:45:03.596557 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-secret-volume\") pod \"f5381f73-cd98-49fd-b2bc-b23f2dc037f0\" (UID: \"f5381f73-cd98-49fd-b2bc-b23f2dc037f0\") " Oct 03 19:45:03 crc kubenswrapper[4835]: I1003 19:45:03.596618 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-config-volume\") pod \"f5381f73-cd98-49fd-b2bc-b23f2dc037f0\" (UID: \"f5381f73-cd98-49fd-b2bc-b23f2dc037f0\") " Oct 03 19:45:03 crc kubenswrapper[4835]: I1003 19:45:03.596932 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n6x4\" (UniqueName: \"kubernetes.io/projected/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-kube-api-access-6n6x4\") pod \"f5381f73-cd98-49fd-b2bc-b23f2dc037f0\" (UID: \"f5381f73-cd98-49fd-b2bc-b23f2dc037f0\") " Oct 03 19:45:03 crc kubenswrapper[4835]: I1003 19:45:03.598136 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-config-volume" (OuterVolumeSpecName: "config-volume") pod "f5381f73-cd98-49fd-b2bc-b23f2dc037f0" (UID: "f5381f73-cd98-49fd-b2bc-b23f2dc037f0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 19:45:03 crc kubenswrapper[4835]: I1003 19:45:03.621607 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-kube-api-access-6n6x4" (OuterVolumeSpecName: "kube-api-access-6n6x4") pod "f5381f73-cd98-49fd-b2bc-b23f2dc037f0" (UID: "f5381f73-cd98-49fd-b2bc-b23f2dc037f0"). InnerVolumeSpecName "kube-api-access-6n6x4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:45:03 crc kubenswrapper[4835]: I1003 19:45:03.622415 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f5381f73-cd98-49fd-b2bc-b23f2dc037f0" (UID: "f5381f73-cd98-49fd-b2bc-b23f2dc037f0"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:45:03 crc kubenswrapper[4835]: I1003 19:45:03.701710 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n6x4\" (UniqueName: \"kubernetes.io/projected/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-kube-api-access-6n6x4\") on node \"crc\" DevicePath \"\"" Oct 03 19:45:03 crc kubenswrapper[4835]: I1003 19:45:03.702009 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 19:45:03 crc kubenswrapper[4835]: I1003 19:45:03.702084 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5381f73-cd98-49fd-b2bc-b23f2dc037f0-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 19:45:04 crc kubenswrapper[4835]: I1003 19:45:04.125171 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql" event={"ID":"f5381f73-cd98-49fd-b2bc-b23f2dc037f0","Type":"ContainerDied","Data":"5a8b03518f40ecb1bc4cb503b39ae75270afccbe1f7ca7831f458df1fa3b9239"} Oct 03 19:45:04 crc kubenswrapper[4835]: I1003 19:45:04.125237 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a8b03518f40ecb1bc4cb503b39ae75270afccbe1f7ca7831f458df1fa3b9239" Oct 03 19:45:04 crc kubenswrapper[4835]: I1003 19:45:04.125790 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325345-wx2ql" Oct 03 19:45:04 crc kubenswrapper[4835]: I1003 19:45:04.597675 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw"] Oct 03 19:45:04 crc kubenswrapper[4835]: I1003 19:45:04.607059 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325300-xqclw"] Oct 03 19:45:04 crc kubenswrapper[4835]: I1003 19:45:04.893231 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1519990-5da3-4aa7-84d9-248acce94038" path="/var/lib/kubelet/pods/d1519990-5da3-4aa7-84d9-248acce94038/volumes" Oct 03 19:45:45 crc kubenswrapper[4835]: I1003 19:45:45.515505 4835 scope.go:117] "RemoveContainer" containerID="b562636137c5f39f7a7dcb7187f2f699d6e6f77f353715a278f1b0e0da3f92e4" Oct 03 19:46:05 crc kubenswrapper[4835]: I1003 19:46:05.359029 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:46:05 crc kubenswrapper[4835]: I1003 19:46:05.359803 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:46:35 crc kubenswrapper[4835]: I1003 19:46:35.358685 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 03 19:46:35 crc kubenswrapper[4835]: I1003 19:46:35.359458 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:47:05 crc kubenswrapper[4835]: I1003 19:47:05.358489 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:47:05 crc kubenswrapper[4835]: I1003 19:47:05.359468 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:47:05 crc kubenswrapper[4835]: I1003 19:47:05.359542 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 19:47:05 crc kubenswrapper[4835]: I1003 19:47:05.360461 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52bdb132b9879678c3a804484b5f7f9e013178301e27181cb58655f3c44e07cd"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 19:47:05 crc kubenswrapper[4835]: I1003 19:47:05.360547 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://52bdb132b9879678c3a804484b5f7f9e013178301e27181cb58655f3c44e07cd" gracePeriod=600 Oct 03 19:47:05 crc kubenswrapper[4835]: I1003 19:47:05.563705 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"52bdb132b9879678c3a804484b5f7f9e013178301e27181cb58655f3c44e07cd"} Oct 03 19:47:05 crc kubenswrapper[4835]: I1003 19:47:05.564262 4835 scope.go:117] "RemoveContainer" containerID="f7719ff6692f46cc363c002268669863fa7daa9e5f034e497dc528c106476ae9" Oct 03 19:47:05 crc kubenswrapper[4835]: I1003 19:47:05.563708 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="52bdb132b9879678c3a804484b5f7f9e013178301e27181cb58655f3c44e07cd" exitCode=0 Oct 03 19:47:06 crc kubenswrapper[4835]: I1003 19:47:06.577951 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385"} Oct 03 19:49:35 crc kubenswrapper[4835]: I1003 19:49:35.359514 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:49:35 crc kubenswrapper[4835]: I1003 19:49:35.360576 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:50:03 crc kubenswrapper[4835]: I1003 19:50:03.727221 4835 generic.go:334] "Generic (PLEG): container finished" podID="87c3be87-c5ee-4d08-a75d-dfeb16c19d7e" containerID="ae2950a8851e4b0ebb401782ce8dbd50cd89b920a0a61a84e362dd26c2ff62fd" exitCode=1 Oct 03 19:50:03 crc kubenswrapper[4835]: I1003 19:50:03.727307 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e","Type":"ContainerDied","Data":"ae2950a8851e4b0ebb401782ce8dbd50cd89b920a0a61a84e362dd26c2ff62fd"} Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.150331 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.164608 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-openstack-config\") pod \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.164686 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-test-operator-ephemeral-workdir\") pod \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.165130 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.165155 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4f9d\" (UniqueName: \"kubernetes.io/projected/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-kube-api-access-s4f9d\") pod \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.165312 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-ca-certs\") pod \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.165339 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-test-operator-ephemeral-temporary\") pod \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.165389 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-ssh-key\") pod \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.165456 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-config-data\") pod \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.165555 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-openstack-config-secret\") pod \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\" (UID: \"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e\") " Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.169257 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e" (UID: "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.171120 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e" (UID: "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.174353 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-config-data" (OuterVolumeSpecName: "config-data") pod "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e" (UID: "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.186669 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-kube-api-access-s4f9d" (OuterVolumeSpecName: "kube-api-access-s4f9d") pod "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e" (UID: "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e"). InnerVolumeSpecName "kube-api-access-s4f9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.217495 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e" (UID: "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.221955 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e" (UID: "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.224359 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e" (UID: "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.259239 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e" (UID: "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.261543 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e" (UID: "87c3be87-c5ee-4d08-a75d-dfeb16c19d7e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.268580 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.268612 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4f9d\" (UniqueName: \"kubernetes.io/projected/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-kube-api-access-s4f9d\") on node \"crc\" DevicePath \"\"" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.268624 4835 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.268660 4835 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.268670 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.268680 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.268689 4835 reconciler_common.go:293] "Volume detached for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.268706 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.268715 4835 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/87c3be87-c5ee-4d08-a75d-dfeb16c19d7e-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.294144 4835 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.359395 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.359457 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.372850 4835 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.754979 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"87c3be87-c5ee-4d08-a75d-dfeb16c19d7e","Type":"ContainerDied","Data":"dd8a57745bb8e2352003022b139b0dd846c99dd4ec56d2ded2923ba735cf35e5"} Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.755021 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd8a57745bb8e2352003022b139b0dd846c99dd4ec56d2ded2923ba735cf35e5" Oct 03 19:50:05 crc kubenswrapper[4835]: I1003 19:50:05.755034 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 03 19:50:14 crc kubenswrapper[4835]: I1003 19:50:14.368960 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 03 19:50:14 crc kubenswrapper[4835]: E1003 19:50:14.376256 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c3be87-c5ee-4d08-a75d-dfeb16c19d7e" containerName="tempest-tests-tempest-tests-runner" Oct 03 19:50:14 crc kubenswrapper[4835]: I1003 19:50:14.376303 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c3be87-c5ee-4d08-a75d-dfeb16c19d7e" containerName="tempest-tests-tempest-tests-runner" Oct 03 19:50:14 crc kubenswrapper[4835]: E1003 19:50:14.376347 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5381f73-cd98-49fd-b2bc-b23f2dc037f0" containerName="collect-profiles" Oct 03 19:50:14 crc kubenswrapper[4835]: I1003 19:50:14.376360 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5381f73-cd98-49fd-b2bc-b23f2dc037f0" containerName="collect-profiles" Oct 03 19:50:14 crc kubenswrapper[4835]: I1003 19:50:14.376775 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5381f73-cd98-49fd-b2bc-b23f2dc037f0" containerName="collect-profiles" Oct 03 19:50:14 crc kubenswrapper[4835]: I1003 19:50:14.376811 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c3be87-c5ee-4d08-a75d-dfeb16c19d7e" containerName="tempest-tests-tempest-tests-runner" Oct 03 19:50:14 crc kubenswrapper[4835]: I1003 19:50:14.378722 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 03 19:50:14 crc kubenswrapper[4835]: I1003 19:50:14.378872 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 19:50:14 crc kubenswrapper[4835]: I1003 19:50:14.382336 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xkg98" Oct 03 19:50:14 crc kubenswrapper[4835]: I1003 19:50:14.502270 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"85d214f1-e043-48fb-84e4-ef0f6134fac3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 19:50:14 crc kubenswrapper[4835]: I1003 19:50:14.502697 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vptfq\" (UniqueName: \"kubernetes.io/projected/85d214f1-e043-48fb-84e4-ef0f6134fac3-kube-api-access-vptfq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"85d214f1-e043-48fb-84e4-ef0f6134fac3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 19:50:14 crc kubenswrapper[4835]: I1003 19:50:14.605159 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vptfq\" (UniqueName: \"kubernetes.io/projected/85d214f1-e043-48fb-84e4-ef0f6134fac3-kube-api-access-vptfq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"85d214f1-e043-48fb-84e4-ef0f6134fac3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 19:50:14 crc kubenswrapper[4835]: I1003 19:50:14.605286 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"85d214f1-e043-48fb-84e4-ef0f6134fac3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 19:50:14 crc kubenswrapper[4835]: I1003 19:50:14.606144 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"85d214f1-e043-48fb-84e4-ef0f6134fac3\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 19:50:14 crc kubenswrapper[4835]: I1003 19:50:14.654066 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vptfq\" (UniqueName: \"kubernetes.io/projected/85d214f1-e043-48fb-84e4-ef0f6134fac3-kube-api-access-vptfq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"85d214f1-e043-48fb-84e4-ef0f6134fac3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 19:50:14 crc kubenswrapper[4835]: I1003 19:50:14.656001 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"85d214f1-e043-48fb-84e4-ef0f6134fac3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 19:50:14 crc kubenswrapper[4835]: I1003 19:50:14.703792 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 19:50:15 crc kubenswrapper[4835]: I1003 19:50:15.259927 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 03 19:50:15 crc kubenswrapper[4835]: I1003 19:50:15.274490 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 19:50:15 crc kubenswrapper[4835]: I1003 19:50:15.882583 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"85d214f1-e043-48fb-84e4-ef0f6134fac3","Type":"ContainerStarted","Data":"9f5fea83f10b8850b6792343c71190460990023a2f08d7f4a58e436446d116ba"} Oct 03 19:50:16 crc kubenswrapper[4835]: I1003 19:50:16.898347 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"85d214f1-e043-48fb-84e4-ef0f6134fac3","Type":"ContainerStarted","Data":"a314a9dca9037fa4b9d15bd667c506c9db68c4b3de47809feff329fa9f5113b0"} Oct 03 19:50:16 crc kubenswrapper[4835]: I1003 19:50:16.928678 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.778696712 podStartE2EDuration="2.928647863s" podCreationTimestamp="2025-10-03 19:50:14 +0000 UTC" firstStartedPulling="2025-10-03 19:50:15.274272877 +0000 UTC m=+5756.990213749" lastFinishedPulling="2025-10-03 19:50:16.424224028 +0000 UTC m=+5758.140164900" observedRunningTime="2025-10-03 19:50:16.915129031 +0000 UTC m=+5758.631069903" watchObservedRunningTime="2025-10-03 19:50:16.928647863 +0000 UTC m=+5758.644588745" Oct 03 19:50:35 crc kubenswrapper[4835]: I1003 19:50:35.358766 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:50:35 crc kubenswrapper[4835]: I1003 19:50:35.359841 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:50:35 crc kubenswrapper[4835]: I1003 19:50:35.359947 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 19:50:35 crc kubenswrapper[4835]: I1003 19:50:35.362160 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 19:50:35 crc kubenswrapper[4835]: I1003 19:50:35.362299 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" gracePeriod=600 
Oct 03 19:50:35 crc kubenswrapper[4835]: E1003 19:50:35.496223 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:50:36 crc kubenswrapper[4835]: I1003 19:50:36.163999 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" exitCode=0 Oct 03 19:50:36 crc kubenswrapper[4835]: I1003 19:50:36.164311 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385"} Oct 03 19:50:36 crc kubenswrapper[4835]: I1003 19:50:36.164946 4835 scope.go:117] "RemoveContainer" containerID="52bdb132b9879678c3a804484b5f7f9e013178301e27181cb58655f3c44e07cd" Oct 03 19:50:36 crc kubenswrapper[4835]: I1003 19:50:36.167145 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:50:36 crc kubenswrapper[4835]: E1003 19:50:36.167870 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:50:38 crc kubenswrapper[4835]: I1003 19:50:38.535031 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j7c72/must-gather-68jml"] Oct 03 19:50:38 crc kubenswrapper[4835]: I1003 19:50:38.537655 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j7c72/must-gather-68jml" Oct 03 19:50:38 crc kubenswrapper[4835]: I1003 19:50:38.549093 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-j7c72"/"openshift-service-ca.crt" Oct 03 19:50:38 crc kubenswrapper[4835]: I1003 19:50:38.549144 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-j7c72"/"kube-root-ca.crt" Oct 03 19:50:38 crc kubenswrapper[4835]: I1003 19:50:38.549107 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-j7c72"/"default-dockercfg-lwhxr" Oct 03 19:50:38 crc kubenswrapper[4835]: I1003 19:50:38.551126 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j7c72/must-gather-68jml"] Oct 03 19:50:38 crc kubenswrapper[4835]: I1003 19:50:38.657020 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqpf9\" (UniqueName: \"kubernetes.io/projected/ce1a2011-32e6-44ba-840e-d840da2bf0f3-kube-api-access-zqpf9\") pod \"must-gather-68jml\" (UID: \"ce1a2011-32e6-44ba-840e-d840da2bf0f3\") " pod="openshift-must-gather-j7c72/must-gather-68jml" Oct 03 19:50:38 crc kubenswrapper[4835]: I1003 19:50:38.657088 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ce1a2011-32e6-44ba-840e-d840da2bf0f3-must-gather-output\") pod \"must-gather-68jml\" (UID: \"ce1a2011-32e6-44ba-840e-d840da2bf0f3\") " pod="openshift-must-gather-j7c72/must-gather-68jml" Oct 03 19:50:38 crc kubenswrapper[4835]: I1003 19:50:38.759037 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqpf9\" (UniqueName: \"kubernetes.io/projected/ce1a2011-32e6-44ba-840e-d840da2bf0f3-kube-api-access-zqpf9\") pod \"must-gather-68jml\" (UID: \"ce1a2011-32e6-44ba-840e-d840da2bf0f3\") " pod="openshift-must-gather-j7c72/must-gather-68jml" Oct 03 19:50:38 crc kubenswrapper[4835]: I1003 19:50:38.759106 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ce1a2011-32e6-44ba-840e-d840da2bf0f3-must-gather-output\") pod \"must-gather-68jml\" (UID: \"ce1a2011-32e6-44ba-840e-d840da2bf0f3\") " pod="openshift-must-gather-j7c72/must-gather-68jml" Oct 03 19:50:38 crc kubenswrapper[4835]: I1003 19:50:38.759615 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ce1a2011-32e6-44ba-840e-d840da2bf0f3-must-gather-output\") pod \"must-gather-68jml\" (UID: \"ce1a2011-32e6-44ba-840e-d840da2bf0f3\") " pod="openshift-must-gather-j7c72/must-gather-68jml" Oct 03 19:50:38 crc kubenswrapper[4835]: I1003 19:50:38.780660 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqpf9\" (UniqueName: \"kubernetes.io/projected/ce1a2011-32e6-44ba-840e-d840da2bf0f3-kube-api-access-zqpf9\") pod \"must-gather-68jml\" (UID: \"ce1a2011-32e6-44ba-840e-d840da2bf0f3\") " pod="openshift-must-gather-j7c72/must-gather-68jml" Oct 03 19:50:38 crc kubenswrapper[4835]: I1003 19:50:38.856605 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j7c72/must-gather-68jml" Oct 03 19:50:39 crc kubenswrapper[4835]: I1003 19:50:39.393020 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-j7c72/must-gather-68jml"] Oct 03 19:50:39 crc kubenswrapper[4835]: W1003 19:50:39.393672 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce1a2011_32e6_44ba_840e_d840da2bf0f3.slice/crio-1cd1aa5598c03c14debcd0b4836152491d91d0d313286655e6b9d5e6344c7c55 WatchSource:0}: Error finding container 1cd1aa5598c03c14debcd0b4836152491d91d0d313286655e6b9d5e6344c7c55: Status 404 returned error can't find the container with id 1cd1aa5598c03c14debcd0b4836152491d91d0d313286655e6b9d5e6344c7c55 Oct 03 19:50:40 crc kubenswrapper[4835]: I1003 19:50:40.257060 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j7c72/must-gather-68jml" event={"ID":"ce1a2011-32e6-44ba-840e-d840da2bf0f3","Type":"ContainerStarted","Data":"1cd1aa5598c03c14debcd0b4836152491d91d0d313286655e6b9d5e6344c7c55"} Oct 03 19:50:47 crc kubenswrapper[4835]: I1003 19:50:47.360718 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j7c72/must-gather-68jml" event={"ID":"ce1a2011-32e6-44ba-840e-d840da2bf0f3","Type":"ContainerStarted","Data":"d1e8a51b6bc0846118d361d712b68ddbd31dc5acdcd360cb57b918fbddaa5c03"} Oct 03 19:50:47 crc kubenswrapper[4835]: I1003 19:50:47.361587 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j7c72/must-gather-68jml" event={"ID":"ce1a2011-32e6-44ba-840e-d840da2bf0f3","Type":"ContainerStarted","Data":"fa6bba39ac7acb899467058c061ef5eab3302964ea92c0890bd50de167c91594"} Oct 03 19:50:47 crc kubenswrapper[4835]: I1003 19:50:47.387882 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-j7c72/must-gather-68jml" podStartSLOduration=2.496055662 podStartE2EDuration="9.387858635s" podCreationTimestamp="2025-10-03 19:50:38 +0000 UTC" firstStartedPulling="2025-10-03 19:50:39.396908151 +0000 UTC m=+5781.112849023" lastFinishedPulling="2025-10-03 19:50:46.288711124 +0000 UTC m=+5788.004651996" observedRunningTime="2025-10-03 19:50:47.380251547 +0000 UTC m=+5789.096192429" watchObservedRunningTime="2025-10-03 19:50:47.387858635 +0000 UTC m=+5789.103799517" Oct 03 19:50:50 crc kubenswrapper[4835]: I1003 19:50:50.876832 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:50:50 crc kubenswrapper[4835]: E1003 19:50:50.877659 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:50:51 crc kubenswrapper[4835]: I1003 19:50:51.723906 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j7c72/crc-debug-p6p8s"] Oct 03 19:50:51 crc kubenswrapper[4835]: I1003 19:50:51.726102 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j7c72/crc-debug-p6p8s" Oct 03 19:50:51 crc kubenswrapper[4835]: I1003 19:50:51.833232 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crlwt\" (UniqueName: \"kubernetes.io/projected/37e3615e-8339-4df8-913d-b2d42eea5ca8-kube-api-access-crlwt\") pod \"crc-debug-p6p8s\" (UID: \"37e3615e-8339-4df8-913d-b2d42eea5ca8\") " pod="openshift-must-gather-j7c72/crc-debug-p6p8s" Oct 03 19:50:51 crc kubenswrapper[4835]: I1003 19:50:51.833428 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37e3615e-8339-4df8-913d-b2d42eea5ca8-host\") pod \"crc-debug-p6p8s\" (UID: \"37e3615e-8339-4df8-913d-b2d42eea5ca8\") " pod="openshift-must-gather-j7c72/crc-debug-p6p8s" Oct 03 19:50:51 crc kubenswrapper[4835]: I1003 19:50:51.937474 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37e3615e-8339-4df8-913d-b2d42eea5ca8-host\") pod \"crc-debug-p6p8s\" (UID: \"37e3615e-8339-4df8-913d-b2d42eea5ca8\") " pod="openshift-must-gather-j7c72/crc-debug-p6p8s" Oct 03 19:50:51 crc kubenswrapper[4835]: I1003 19:50:51.937597 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37e3615e-8339-4df8-913d-b2d42eea5ca8-host\") pod \"crc-debug-p6p8s\" (UID: \"37e3615e-8339-4df8-913d-b2d42eea5ca8\") " pod="openshift-must-gather-j7c72/crc-debug-p6p8s" Oct 03 19:50:51 crc kubenswrapper[4835]: I1003 19:50:51.939603 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crlwt\" (UniqueName: \"kubernetes.io/projected/37e3615e-8339-4df8-913d-b2d42eea5ca8-kube-api-access-crlwt\") pod \"crc-debug-p6p8s\" (UID: \"37e3615e-8339-4df8-913d-b2d42eea5ca8\") " pod="openshift-must-gather-j7c72/crc-debug-p6p8s" Oct 03 19:50:51 crc kubenswrapper[4835]: I1003 19:50:51.962404 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crlwt\" (UniqueName: \"kubernetes.io/projected/37e3615e-8339-4df8-913d-b2d42eea5ca8-kube-api-access-crlwt\") pod \"crc-debug-p6p8s\" (UID: \"37e3615e-8339-4df8-913d-b2d42eea5ca8\") " pod="openshift-must-gather-j7c72/crc-debug-p6p8s" Oct 03 19:50:52 crc kubenswrapper[4835]: I1003 19:50:52.046170 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j7c72/crc-debug-p6p8s" Oct 03 19:50:52 crc kubenswrapper[4835]: I1003 19:50:52.420704 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j7c72/crc-debug-p6p8s" event={"ID":"37e3615e-8339-4df8-913d-b2d42eea5ca8","Type":"ContainerStarted","Data":"1e1e77f92755c64eb5a2e91ebf792ae162829bd8ee55ce341abac7c5773ac1e3"} Oct 03 19:51:04 crc kubenswrapper[4835]: I1003 19:51:04.570203 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j7c72/crc-debug-p6p8s" event={"ID":"37e3615e-8339-4df8-913d-b2d42eea5ca8","Type":"ContainerStarted","Data":"ff4407541c28b6a0560cc5f03b058a0838f9250417d895952203cc810d9abc9c"} Oct 03 19:51:04 crc kubenswrapper[4835]: I1003 19:51:04.595417 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-j7c72/crc-debug-p6p8s" podStartSLOduration=1.878460306 podStartE2EDuration="13.595393454s" podCreationTimestamp="2025-10-03 19:50:51 +0000 UTC" firstStartedPulling="2025-10-03 19:50:52.097288594 +0000 UTC m=+5793.813229466" lastFinishedPulling="2025-10-03 19:51:03.814221742 +0000 UTC m=+5805.530162614" observedRunningTime="2025-10-03 19:51:04.59039115 +0000 UTC m=+5806.306332022" watchObservedRunningTime="2025-10-03 19:51:04.595393454 +0000 UTC m=+5806.311334316" Oct 03 19:51:04 crc kubenswrapper[4835]: I1003 19:51:04.877298 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:51:04 crc kubenswrapper[4835]: E1003 19:51:04.877979 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:51:17 crc kubenswrapper[4835]: I1003 19:51:17.877605 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:51:17 crc kubenswrapper[4835]: E1003 19:51:17.878761 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:51:29 crc kubenswrapper[4835]: I1003 19:51:29.877680 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:51:29 crc kubenswrapper[4835]: E1003 19:51:29.878841 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:51:41 crc kubenswrapper[4835]: I1003 19:51:41.880253 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 
19:51:41 crc kubenswrapper[4835]: E1003 19:51:41.881361 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:51:56 crc kubenswrapper[4835]: I1003 19:51:56.877889 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:51:56 crc kubenswrapper[4835]: E1003 19:51:56.878909 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:52:11 crc kubenswrapper[4835]: I1003 19:52:11.878335 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:52:11 crc kubenswrapper[4835]: E1003 19:52:11.879304 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:52:16 crc kubenswrapper[4835]: I1003 19:52:16.164968 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xfx86"] Oct 03 19:52:16 crc kubenswrapper[4835]: I1003 19:52:16.168843 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:16 crc kubenswrapper[4835]: I1003 19:52:16.196499 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xfx86"] Oct 03 19:52:16 crc kubenswrapper[4835]: I1003 19:52:16.293813 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03190a1a-5e00-41b1-ba99-945bb8e6fca5-utilities\") pod \"certified-operators-xfx86\" (UID: \"03190a1a-5e00-41b1-ba99-945bb8e6fca5\") " pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:16 crc kubenswrapper[4835]: I1003 19:52:16.294271 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddhzk\" (UniqueName: \"kubernetes.io/projected/03190a1a-5e00-41b1-ba99-945bb8e6fca5-kube-api-access-ddhzk\") pod \"certified-operators-xfx86\" (UID: \"03190a1a-5e00-41b1-ba99-945bb8e6fca5\") " pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:16 crc kubenswrapper[4835]: I1003 19:52:16.294312 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03190a1a-5e00-41b1-ba99-945bb8e6fca5-catalog-content\") pod \"certified-operators-xfx86\" (UID: \"03190a1a-5e00-41b1-ba99-945bb8e6fca5\") " pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:16 crc kubenswrapper[4835]: I1003 19:52:16.396649 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03190a1a-5e00-41b1-ba99-945bb8e6fca5-utilities\") pod \"certified-operators-xfx86\" (UID: \"03190a1a-5e00-41b1-ba99-945bb8e6fca5\") " pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:16 crc kubenswrapper[4835]: I1003 19:52:16.396793 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddhzk\" (UniqueName: \"kubernetes.io/projected/03190a1a-5e00-41b1-ba99-945bb8e6fca5-kube-api-access-ddhzk\") pod \"certified-operators-xfx86\" (UID: \"03190a1a-5e00-41b1-ba99-945bb8e6fca5\") " pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:16 crc kubenswrapper[4835]: I1003 19:52:16.396832 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03190a1a-5e00-41b1-ba99-945bb8e6fca5-catalog-content\") pod \"certified-operators-xfx86\" (UID: \"03190a1a-5e00-41b1-ba99-945bb8e6fca5\") " pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:16 crc kubenswrapper[4835]: I1003 19:52:16.397342 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03190a1a-5e00-41b1-ba99-945bb8e6fca5-utilities\") pod \"certified-operators-xfx86\" (UID: \"03190a1a-5e00-41b1-ba99-945bb8e6fca5\") " pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:16 crc kubenswrapper[4835]: I1003 19:52:16.397404 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03190a1a-5e00-41b1-ba99-945bb8e6fca5-catalog-content\") pod \"certified-operators-xfx86\" (UID: \"03190a1a-5e00-41b1-ba99-945bb8e6fca5\") " pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:16 crc kubenswrapper[4835]: I1003 19:52:16.430362 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ddhzk\" (UniqueName: \"kubernetes.io/projected/03190a1a-5e00-41b1-ba99-945bb8e6fca5-kube-api-access-ddhzk\") pod \"certified-operators-xfx86\" (UID: \"03190a1a-5e00-41b1-ba99-945bb8e6fca5\") " pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:16 crc kubenswrapper[4835]: I1003 19:52:16.491437 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:17 crc kubenswrapper[4835]: I1003 19:52:17.085847 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xfx86"] Oct 03 19:52:17 crc kubenswrapper[4835]: I1003 19:52:17.459912 4835 generic.go:334] "Generic (PLEG): container finished" podID="03190a1a-5e00-41b1-ba99-945bb8e6fca5" containerID="eb8040a9082ef184442e47ae34cb67c89dcb312888b851fe3394bf6de4445779" exitCode=0 Oct 03 19:52:17 crc kubenswrapper[4835]: I1003 19:52:17.460033 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xfx86" event={"ID":"03190a1a-5e00-41b1-ba99-945bb8e6fca5","Type":"ContainerDied","Data":"eb8040a9082ef184442e47ae34cb67c89dcb312888b851fe3394bf6de4445779"} Oct 03 19:52:17 crc kubenswrapper[4835]: I1003 19:52:17.460529 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xfx86" event={"ID":"03190a1a-5e00-41b1-ba99-945bb8e6fca5","Type":"ContainerStarted","Data":"28e01f6a19e6f1e00c387b2ad344c9314f46ea8fbf6545c5aa1ff582b7fc916f"} Oct 03 19:52:18 crc kubenswrapper[4835]: I1003 19:52:18.477829 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xfx86" event={"ID":"03190a1a-5e00-41b1-ba99-945bb8e6fca5","Type":"ContainerStarted","Data":"cfa236de46cf0e6f9571cad237755463960ef447be47bb76e520a75acbd91a1c"} Oct 03 19:52:19 crc kubenswrapper[4835]: I1003 19:52:19.494595 4835 generic.go:334] "Generic (PLEG): container finished" podID="03190a1a-5e00-41b1-ba99-945bb8e6fca5" containerID="cfa236de46cf0e6f9571cad237755463960ef447be47bb76e520a75acbd91a1c" exitCode=0 Oct 03 19:52:19 crc kubenswrapper[4835]: I1003 19:52:19.494691 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xfx86" event={"ID":"03190a1a-5e00-41b1-ba99-945bb8e6fca5","Type":"ContainerDied","Data":"cfa236de46cf0e6f9571cad237755463960ef447be47bb76e520a75acbd91a1c"} Oct 03 19:52:20 crc kubenswrapper[4835]: I1003 19:52:20.551846 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xfx86" event={"ID":"03190a1a-5e00-41b1-ba99-945bb8e6fca5","Type":"ContainerStarted","Data":"5094fb66e6ae69f84c6247181751df2b5bd8ea8b9d0fad8c7c7165ba1dc3eeee"} Oct 03 19:52:20 crc kubenswrapper[4835]: I1003 19:52:20.607151 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xfx86" podStartSLOduration=2.136369283 podStartE2EDuration="4.607101825s" podCreationTimestamp="2025-10-03 19:52:16 +0000 UTC" firstStartedPulling="2025-10-03 19:52:17.46290926 +0000 UTC m=+5879.178850132" lastFinishedPulling="2025-10-03 19:52:19.933641792 +0000 UTC m=+5881.649582674" observedRunningTime="2025-10-03 19:52:20.597813116 +0000 UTC m=+5882.313753988" watchObservedRunningTime="2025-10-03 19:52:20.607101825 +0000 UTC m=+5882.323042697" Oct 03 19:52:24 crc kubenswrapper[4835]: I1003 19:52:24.877451 4835 scope.go:117] "RemoveContainer" 
containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:52:24 crc kubenswrapper[4835]: E1003 19:52:24.878280 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:52:25 crc kubenswrapper[4835]: I1003 19:52:25.167670 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-9cdbfcc7d-ccpdw_204b3b60-3ae4-4915-8810-3423d4308efb/barbican-api/0.log" Oct 03 19:52:25 crc kubenswrapper[4835]: I1003 19:52:25.234003 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-9cdbfcc7d-ccpdw_204b3b60-3ae4-4915-8810-3423d4308efb/barbican-api-log/0.log" Oct 03 19:52:25 crc kubenswrapper[4835]: I1003 19:52:25.438536 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8585b7f888-6wk2b_42897a96-1d94-485f-9448-792d48138492/barbican-keystone-listener/0.log" Oct 03 19:52:25 crc kubenswrapper[4835]: I1003 19:52:25.570235 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8585b7f888-6wk2b_42897a96-1d94-485f-9448-792d48138492/barbican-keystone-listener-log/0.log" Oct 03 19:52:25 crc kubenswrapper[4835]: I1003 19:52:25.728156 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-649c499755-ttlj6_9343cae5-9ae1-4cae-b5a1-31acc9b34217/barbican-worker/0.log" Oct 03 19:52:25 crc kubenswrapper[4835]: I1003 19:52:25.831560 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-649c499755-ttlj6_9343cae5-9ae1-4cae-b5a1-31acc9b34217/barbican-worker-log/0.log" Oct 03 19:52:26 crc kubenswrapper[4835]: I1003 19:52:26.007181 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-2n9ns_99ce37b7-29b9-44ed-a066-bc503ae35b61/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:26 crc kubenswrapper[4835]: I1003 19:52:26.260806 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1d393ce1-3377-4099-9948-423066ae1ee5/ceilometer-central-agent/0.log" Oct 03 19:52:26 crc kubenswrapper[4835]: I1003 19:52:26.299986 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1d393ce1-3377-4099-9948-423066ae1ee5/ceilometer-notification-agent/0.log" Oct 03 19:52:26 crc kubenswrapper[4835]: I1003 19:52:26.397032 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1d393ce1-3377-4099-9948-423066ae1ee5/proxy-httpd/0.log" Oct 03 19:52:26 crc kubenswrapper[4835]: I1003 19:52:26.492300 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:26 crc kubenswrapper[4835]: I1003 19:52:26.492367 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:26 crc kubenswrapper[4835]: I1003 19:52:26.513287 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1d393ce1-3377-4099-9948-423066ae1ee5/sg-core/0.log" Oct 03 19:52:26 crc kubenswrapper[4835]: I1003 
19:52:26.555416 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:26 crc kubenswrapper[4835]: I1003 19:52:26.673945 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:26 crc kubenswrapper[4835]: I1003 19:52:26.782435 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d985d9ce-6643-4a1f-a889-4a61beb59bfa/cinder-api-log/0.log" Oct 03 19:52:26 crc kubenswrapper[4835]: I1003 19:52:26.814986 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xfx86"] Oct 03 19:52:26 crc kubenswrapper[4835]: I1003 19:52:26.864603 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d985d9ce-6643-4a1f-a889-4a61beb59bfa/cinder-api/0.log" Oct 03 19:52:27 crc kubenswrapper[4835]: I1003 19:52:27.036396 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b57fa1fd-eb1f-4c40-a153-4e6f48698ab8/cinder-scheduler/0.log" Oct 03 19:52:27 crc kubenswrapper[4835]: I1003 19:52:27.192942 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b57fa1fd-eb1f-4c40-a153-4e6f48698ab8/probe/0.log" Oct 03 19:52:27 crc kubenswrapper[4835]: I1003 19:52:27.309770 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-8bb8d_2d0143f0-3796-4a3e-985b-7c240fd0158b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:27 crc kubenswrapper[4835]: I1003 19:52:27.440266 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-mgpp2_2ee17011-d405-4e45-84c9-b48eb4ec6820/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:27 crc kubenswrapper[4835]: I1003 19:52:27.682904 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-nqjn5_d8b1395f-37ef-43aa-94b2-4a761f358242/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:27 crc kubenswrapper[4835]: I1003 19:52:27.885896 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5f444fb569-rpskt_c89c5f86-48a0-4bc7-9052-806376312506/init/0.log" Oct 03 19:52:28 crc kubenswrapper[4835]: I1003 19:52:28.052198 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5f444fb569-rpskt_c89c5f86-48a0-4bc7-9052-806376312506/init/0.log" Oct 03 19:52:28 crc kubenswrapper[4835]: I1003 19:52:28.242146 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5f444fb569-rpskt_c89c5f86-48a0-4bc7-9052-806376312506/dnsmasq-dns/0.log" Oct 03 19:52:28 crc kubenswrapper[4835]: I1003 19:52:28.314145 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-wzjgm_040bbc22-68da-4384-981c-4b7716352d49/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:28 crc kubenswrapper[4835]: I1003 19:52:28.484945 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_6243f0de-fd35-43f6-8eaa-63836b03e125/glance-httpd/0.log" Oct 03 19:52:28 crc kubenswrapper[4835]: I1003 19:52:28.508839 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_6243f0de-fd35-43f6-8eaa-63836b03e125/glance-log/0.log" Oct 03 19:52:28 crc kubenswrapper[4835]: I1003 19:52:28.632411 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_399dbdfc-97f0-4d54-9e3e-a18ae490838c/glance-httpd/0.log" Oct 03 19:52:28 crc kubenswrapper[4835]: I1003 19:52:28.641032 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xfx86" podUID="03190a1a-5e00-41b1-ba99-945bb8e6fca5" containerName="registry-server" containerID="cri-o://5094fb66e6ae69f84c6247181751df2b5bd8ea8b9d0fad8c7c7165ba1dc3eeee" gracePeriod=2 Oct 03 19:52:28 crc kubenswrapper[4835]: I1003 19:52:28.720363 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_399dbdfc-97f0-4d54-9e3e-a18ae490838c/glance-log/0.log" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.274919 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-84859df966-b4t26_dbf2013d-5dc5-4fe6-a408-08757b74ecc8/horizon/0.log" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.326488 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.395850 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-7l67z_e7e6dfcd-b745-4ff8-bf81-023fdc9a66f2/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.433688 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03190a1a-5e00-41b1-ba99-945bb8e6fca5-utilities\") pod \"03190a1a-5e00-41b1-ba99-945bb8e6fca5\" (UID: \"03190a1a-5e00-41b1-ba99-945bb8e6fca5\") " Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.433839 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03190a1a-5e00-41b1-ba99-945bb8e6fca5-catalog-content\") pod \"03190a1a-5e00-41b1-ba99-945bb8e6fca5\" (UID: \"03190a1a-5e00-41b1-ba99-945bb8e6fca5\") " Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.433925 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddhzk\" (UniqueName: \"kubernetes.io/projected/03190a1a-5e00-41b1-ba99-945bb8e6fca5-kube-api-access-ddhzk\") pod \"03190a1a-5e00-41b1-ba99-945bb8e6fca5\" (UID: \"03190a1a-5e00-41b1-ba99-945bb8e6fca5\") " Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.436611 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03190a1a-5e00-41b1-ba99-945bb8e6fca5-utilities" (OuterVolumeSpecName: "utilities") pod "03190a1a-5e00-41b1-ba99-945bb8e6fca5" (UID: "03190a1a-5e00-41b1-ba99-945bb8e6fca5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.444819 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03190a1a-5e00-41b1-ba99-945bb8e6fca5-kube-api-access-ddhzk" (OuterVolumeSpecName: "kube-api-access-ddhzk") pod "03190a1a-5e00-41b1-ba99-945bb8e6fca5" (UID: "03190a1a-5e00-41b1-ba99-945bb8e6fca5"). InnerVolumeSpecName "kube-api-access-ddhzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.536300 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03190a1a-5e00-41b1-ba99-945bb8e6fca5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03190a1a-5e00-41b1-ba99-945bb8e6fca5" (UID: "03190a1a-5e00-41b1-ba99-945bb8e6fca5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.539351 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03190a1a-5e00-41b1-ba99-945bb8e6fca5-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.539388 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03190a1a-5e00-41b1-ba99-945bb8e6fca5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.539405 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddhzk\" (UniqueName: \"kubernetes.io/projected/03190a1a-5e00-41b1-ba99-945bb8e6fca5-kube-api-access-ddhzk\") on node \"crc\" DevicePath \"\"" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.634139 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-gpsws_b230c3be-e2a6-49eb-90f7-97732a8be2ad/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.655675 4835 generic.go:334] "Generic (PLEG): container finished" podID="03190a1a-5e00-41b1-ba99-945bb8e6fca5" containerID="5094fb66e6ae69f84c6247181751df2b5bd8ea8b9d0fad8c7c7165ba1dc3eeee" exitCode=0 Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.655735 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xfx86" event={"ID":"03190a1a-5e00-41b1-ba99-945bb8e6fca5","Type":"ContainerDied","Data":"5094fb66e6ae69f84c6247181751df2b5bd8ea8b9d0fad8c7c7165ba1dc3eeee"} Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.655772 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xfx86" event={"ID":"03190a1a-5e00-41b1-ba99-945bb8e6fca5","Type":"ContainerDied","Data":"28e01f6a19e6f1e00c387b2ad344c9314f46ea8fbf6545c5aa1ff582b7fc916f"} Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.655796 4835 scope.go:117] "RemoveContainer" containerID="5094fb66e6ae69f84c6247181751df2b5bd8ea8b9d0fad8c7c7165ba1dc3eeee" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.655980 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xfx86" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.693629 4835 scope.go:117] "RemoveContainer" containerID="cfa236de46cf0e6f9571cad237755463960ef447be47bb76e520a75acbd91a1c" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.722734 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xfx86"] Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.731874 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xfx86"] Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.740724 4835 scope.go:117] "RemoveContainer" containerID="eb8040a9082ef184442e47ae34cb67c89dcb312888b851fe3394bf6de4445779" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.775327 4835 scope.go:117] "RemoveContainer" containerID="5094fb66e6ae69f84c6247181751df2b5bd8ea8b9d0fad8c7c7165ba1dc3eeee" Oct 03 19:52:29 crc kubenswrapper[4835]: E1003 19:52:29.776223 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5094fb66e6ae69f84c6247181751df2b5bd8ea8b9d0fad8c7c7165ba1dc3eeee\": container with ID starting with 5094fb66e6ae69f84c6247181751df2b5bd8ea8b9d0fad8c7c7165ba1dc3eeee not found: ID does not exist" containerID="5094fb66e6ae69f84c6247181751df2b5bd8ea8b9d0fad8c7c7165ba1dc3eeee" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.776295 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5094fb66e6ae69f84c6247181751df2b5bd8ea8b9d0fad8c7c7165ba1dc3eeee"} err="failed to get container status \"5094fb66e6ae69f84c6247181751df2b5bd8ea8b9d0fad8c7c7165ba1dc3eeee\": rpc error: code = NotFound desc = could not find container \"5094fb66e6ae69f84c6247181751df2b5bd8ea8b9d0fad8c7c7165ba1dc3eeee\": container with ID starting with 5094fb66e6ae69f84c6247181751df2b5bd8ea8b9d0fad8c7c7165ba1dc3eeee not found: ID does not exist" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.776336 4835 scope.go:117] "RemoveContainer" containerID="cfa236de46cf0e6f9571cad237755463960ef447be47bb76e520a75acbd91a1c" Oct 03 19:52:29 crc kubenswrapper[4835]: E1003 19:52:29.776949 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa236de46cf0e6f9571cad237755463960ef447be47bb76e520a75acbd91a1c\": container with ID starting with cfa236de46cf0e6f9571cad237755463960ef447be47bb76e520a75acbd91a1c not found: ID does not exist" containerID="cfa236de46cf0e6f9571cad237755463960ef447be47bb76e520a75acbd91a1c" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.776992 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa236de46cf0e6f9571cad237755463960ef447be47bb76e520a75acbd91a1c"} err="failed to get container status \"cfa236de46cf0e6f9571cad237755463960ef447be47bb76e520a75acbd91a1c\": rpc error: code = NotFound desc = could not find container \"cfa236de46cf0e6f9571cad237755463960ef447be47bb76e520a75acbd91a1c\": container with ID starting with cfa236de46cf0e6f9571cad237755463960ef447be47bb76e520a75acbd91a1c not found: ID does not exist" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.777036 4835 scope.go:117] "RemoveContainer" containerID="eb8040a9082ef184442e47ae34cb67c89dcb312888b851fe3394bf6de4445779" Oct 03 19:52:29 crc kubenswrapper[4835]: E1003 19:52:29.777418 4835 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"eb8040a9082ef184442e47ae34cb67c89dcb312888b851fe3394bf6de4445779\": container with ID starting with eb8040a9082ef184442e47ae34cb67c89dcb312888b851fe3394bf6de4445779 not found: ID does not exist" containerID="eb8040a9082ef184442e47ae34cb67c89dcb312888b851fe3394bf6de4445779" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.777438 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb8040a9082ef184442e47ae34cb67c89dcb312888b851fe3394bf6de4445779"} err="failed to get container status \"eb8040a9082ef184442e47ae34cb67c89dcb312888b851fe3394bf6de4445779\": rpc error: code = NotFound desc = could not find container \"eb8040a9082ef184442e47ae34cb67c89dcb312888b851fe3394bf6de4445779\": container with ID starting with eb8040a9082ef184442e47ae34cb67c89dcb312888b851fe3394bf6de4445779 not found: ID does not exist" Oct 03 19:52:29 crc kubenswrapper[4835]: I1003 19:52:29.875574 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-84859df966-b4t26_dbf2013d-5dc5-4fe6-a408-08757b74ecc8/horizon-log/0.log" Oct 03 19:52:30 crc kubenswrapper[4835]: I1003 19:52:30.064520 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29325301-bb85k_99c21b19-0aed-4ab5-9d16-dcfa45e3236c/keystone-cron/0.log" Oct 03 19:52:30 crc kubenswrapper[4835]: I1003 19:52:30.272366 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_a09c4697-168d-4762-8169-e36de57bfd7c/kube-state-metrics/0.log" Oct 03 19:52:30 crc kubenswrapper[4835]: I1003 19:52:30.393099 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5c4db54587-knmn7_522ccc9d-3dab-4ee6-8a2a-882de9d37457/keystone-api/0.log" Oct 03 19:52:30 crc kubenswrapper[4835]: I1003 19:52:30.514717 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-99fng_f6fd81f4-842f-4628-8044-45b76f848087/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:30 crc kubenswrapper[4835]: I1003 19:52:30.907312 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03190a1a-5e00-41b1-ba99-945bb8e6fca5" path="/var/lib/kubelet/pods/03190a1a-5e00-41b1-ba99-945bb8e6fca5/volumes" Oct 03 19:52:31 crc kubenswrapper[4835]: I1003 19:52:31.115053 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-84d4b96669-666zm_c9649777-5191-4e89-a8b0-a164e4998af6/neutron-httpd/0.log" Oct 03 19:52:31 crc kubenswrapper[4835]: I1003 19:52:31.229905 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-84d4b96669-666zm_c9649777-5191-4e89-a8b0-a164e4998af6/neutron-api/0.log" Oct 03 19:52:31 crc kubenswrapper[4835]: I1003 19:52:31.306507 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-w2jwp_440c7001-e0d5-4840-8852-3b1b59285550/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:32 crc kubenswrapper[4835]: I1003 19:52:32.327261 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_28ff1a76-f690-4565-962f-6768463be408/nova-cell0-conductor-conductor/0.log" Oct 03 19:52:33 crc kubenswrapper[4835]: I1003 19:52:33.165212 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_56348f36-94bd-43e7-a6ea-d55206a5ccc3/nova-cell1-conductor-conductor/0.log" Oct 
03 19:52:33 crc kubenswrapper[4835]: I1003 19:52:33.608426 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_935cc750-935d-4dc0-8a1c-3bce43a57402/nova-api-log/0.log" Oct 03 19:52:33 crc kubenswrapper[4835]: I1003 19:52:33.848043 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d5a86172-d2db-4c6d-92a3-7a747955f3a4/nova-cell1-novncproxy-novncproxy/0.log" Oct 03 19:52:33 crc kubenswrapper[4835]: I1003 19:52:33.871911 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_935cc750-935d-4dc0-8a1c-3bce43a57402/nova-api-api/0.log" Oct 03 19:52:34 crc kubenswrapper[4835]: I1003 19:52:34.176720 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-h27nb_55d0501b-c32f-4bf7-b52f-e5b941d49926/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:34 crc kubenswrapper[4835]: I1003 19:52:34.268757 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a67c0bec-a5df-4ffa-a903-6f73e88a0d19/nova-metadata-log/0.log" Oct 03 19:52:35 crc kubenswrapper[4835]: I1003 19:52:35.026141 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_bc0f2678-65f3-4fb6-959c-73eee3fbf7de/nova-scheduler-scheduler/0.log" Oct 03 19:52:35 crc kubenswrapper[4835]: I1003 19:52:35.161244 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d91a9a1f-a39c-4a80-8bf4-1196bacc8870/mysql-bootstrap/0.log" Oct 03 19:52:35 crc kubenswrapper[4835]: I1003 19:52:35.461842 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d91a9a1f-a39c-4a80-8bf4-1196bacc8870/galera/0.log" Oct 03 19:52:35 crc kubenswrapper[4835]: I1003 19:52:35.477210 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d91a9a1f-a39c-4a80-8bf4-1196bacc8870/mysql-bootstrap/0.log" Oct 03 19:52:35 crc kubenswrapper[4835]: I1003 19:52:35.731320 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_71cb8688-6214-4e5e-a7da-051c5939df65/mysql-bootstrap/0.log" Oct 03 19:52:36 crc kubenswrapper[4835]: I1003 19:52:36.017765 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_71cb8688-6214-4e5e-a7da-051c5939df65/mysql-bootstrap/0.log" Oct 03 19:52:36 crc kubenswrapper[4835]: I1003 19:52:36.103608 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_71cb8688-6214-4e5e-a7da-051c5939df65/galera/0.log" Oct 03 19:52:36 crc kubenswrapper[4835]: I1003 19:52:36.366394 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_d72a4931-5c9f-4e46-a8b1-5f0f07b4c643/openstackclient/0.log" Oct 03 19:52:36 crc kubenswrapper[4835]: I1003 19:52:36.919851 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-kkp6t_f94fd7e6-3253-4adc-a1f9-188598d9ed3b/openstack-network-exporter/0.log" Oct 03 19:52:37 crc kubenswrapper[4835]: I1003 19:52:37.199401 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gkx58_6ad2fc66-9ffc-4229-a5d0-63c8239c8c69/ovsdb-server-init/0.log" Oct 03 19:52:37 crc kubenswrapper[4835]: I1003 19:52:37.309592 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a67c0bec-a5df-4ffa-a903-6f73e88a0d19/nova-metadata-metadata/0.log" Oct 03 
19:52:37 crc kubenswrapper[4835]: I1003 19:52:37.482736 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gkx58_6ad2fc66-9ffc-4229-a5d0-63c8239c8c69/ovsdb-server-init/0.log" Oct 03 19:52:37 crc kubenswrapper[4835]: I1003 19:52:37.601578 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gkx58_6ad2fc66-9ffc-4229-a5d0-63c8239c8c69/ovsdb-server/0.log" Oct 03 19:52:37 crc kubenswrapper[4835]: I1003 19:52:37.784785 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gkx58_6ad2fc66-9ffc-4229-a5d0-63c8239c8c69/ovs-vswitchd/0.log" Oct 03 19:52:37 crc kubenswrapper[4835]: I1003 19:52:37.873322 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-plxn4_c9280b95-ef96-4c58-948f-2abcd7ad8a25/ovn-controller/0.log" Oct 03 19:52:38 crc kubenswrapper[4835]: I1003 19:52:38.057420 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-jq6zq_f0d5e8ab-25a4-4213-9bb4-1e41116eab53/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:38 crc kubenswrapper[4835]: I1003 19:52:38.258410 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d54a09d0-c069-432c-96e9-e742f143e2a9/openstack-network-exporter/0.log" Oct 03 19:52:38 crc kubenswrapper[4835]: I1003 19:52:38.376710 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d54a09d0-c069-432c-96e9-e742f143e2a9/ovn-northd/0.log" Oct 03 19:52:38 crc kubenswrapper[4835]: I1003 19:52:38.557032 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_90ba54d0-8893-4168-a725-993778708104/openstack-network-exporter/0.log" Oct 03 19:52:38 crc kubenswrapper[4835]: I1003 19:52:38.599935 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_90ba54d0-8893-4168-a725-993778708104/ovsdbserver-nb/0.log" Oct 03 19:52:38 crc kubenswrapper[4835]: I1003 19:52:38.804130 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2d5b41f3-fbdf-4663-af12-1f55f598de56/openstack-network-exporter/0.log" Oct 03 19:52:38 crc kubenswrapper[4835]: I1003 19:52:38.886849 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:52:38 crc kubenswrapper[4835]: E1003 19:52:38.887191 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:52:38 crc kubenswrapper[4835]: I1003 19:52:38.895155 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2d5b41f3-fbdf-4663-af12-1f55f598de56/ovsdbserver-sb/0.log" Oct 03 19:52:39 crc kubenswrapper[4835]: I1003 19:52:39.386190 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-799c89c95d-bzssk_eb97c512-09cc-43f3-8619-7083c7b803ff/placement-api/0.log" Oct 03 19:52:39 crc kubenswrapper[4835]: I1003 19:52:39.464168 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_placement-799c89c95d-bzssk_eb97c512-09cc-43f3-8619-7083c7b803ff/placement-log/0.log" Oct 03 19:52:39 crc kubenswrapper[4835]: I1003 19:52:39.568758 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_07f8f72c-80ef-4fd1-a8d7-8167537568d3/init-config-reloader/0.log" Oct 03 19:52:39 crc kubenswrapper[4835]: I1003 19:52:39.737285 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_07f8f72c-80ef-4fd1-a8d7-8167537568d3/init-config-reloader/0.log" Oct 03 19:52:39 crc kubenswrapper[4835]: I1003 19:52:39.767363 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_07f8f72c-80ef-4fd1-a8d7-8167537568d3/config-reloader/0.log" Oct 03 19:52:39 crc kubenswrapper[4835]: I1003 19:52:39.799844 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_07f8f72c-80ef-4fd1-a8d7-8167537568d3/prometheus/0.log" Oct 03 19:52:40 crc kubenswrapper[4835]: I1003 19:52:40.020906 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_07f8f72c-80ef-4fd1-a8d7-8167537568d3/thanos-sidecar/0.log" Oct 03 19:52:40 crc kubenswrapper[4835]: I1003 19:52:40.021331 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_53873352-044d-4511-b474-6da275dc856e/setup-container/0.log" Oct 03 19:52:40 crc kubenswrapper[4835]: I1003 19:52:40.305563 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_53873352-044d-4511-b474-6da275dc856e/setup-container/0.log" Oct 03 19:52:40 crc kubenswrapper[4835]: I1003 19:52:40.355426 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_53873352-044d-4511-b474-6da275dc856e/rabbitmq/0.log" Oct 03 19:52:40 crc kubenswrapper[4835]: I1003 19:52:40.560732 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_b17ce629-9abd-42ba-8004-cc4b85cee405/setup-container/0.log" Oct 03 19:52:40 crc kubenswrapper[4835]: I1003 19:52:40.764138 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_b17ce629-9abd-42ba-8004-cc4b85cee405/setup-container/0.log" Oct 03 19:52:40 crc kubenswrapper[4835]: I1003 19:52:40.780345 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_b17ce629-9abd-42ba-8004-cc4b85cee405/rabbitmq/0.log" Oct 03 19:52:41 crc kubenswrapper[4835]: I1003 19:52:41.080889 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_38b03498-2a1a-4e93-993a-009b39463f69/setup-container/0.log" Oct 03 19:52:41 crc kubenswrapper[4835]: I1003 19:52:41.214057 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_38b03498-2a1a-4e93-993a-009b39463f69/setup-container/0.log" Oct 03 19:52:41 crc kubenswrapper[4835]: I1003 19:52:41.285749 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_38b03498-2a1a-4e93-993a-009b39463f69/rabbitmq/0.log" Oct 03 19:52:41 crc kubenswrapper[4835]: I1003 19:52:41.467022 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-mwzqb_ff850717-c781-4037-81d1-d5538ea47f65/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:41 crc kubenswrapper[4835]: I1003 
19:52:41.658938 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-4mcgc_87be00bb-8652-4279-a481-d69d219cd882/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:41 crc kubenswrapper[4835]: I1003 19:52:41.910784 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-qmj48_1571f66f-2633-4835-ba3f-db5f52eefb9b/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:42 crc kubenswrapper[4835]: I1003 19:52:42.117720 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-hzlrs_94670192-4404-4657-8f2a-c493b239e2bd/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:42 crc kubenswrapper[4835]: I1003 19:52:42.219111 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dz75p_8579b92c-53fb-4e67-af9b-40881365b520/ssh-known-hosts-edpm-deployment/0.log" Oct 03 19:52:42 crc kubenswrapper[4835]: I1003 19:52:42.491528 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-75bfc94c9f-srgmb_f75702cb-25b4-45f5-a26f-0867f10dc525/proxy-server/0.log" Oct 03 19:52:42 crc kubenswrapper[4835]: I1003 19:52:42.690376 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-75bfc94c9f-srgmb_f75702cb-25b4-45f5-a26f-0867f10dc525/proxy-httpd/0.log" Oct 03 19:52:42 crc kubenswrapper[4835]: I1003 19:52:42.749085 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mflzl_ac2daa29-f8b3-4aa7-a4ac-f75d003fb6f2/swift-ring-rebalance/0.log" Oct 03 19:52:42 crc kubenswrapper[4835]: I1003 19:52:42.972239 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8359224-c77a-4d86-878b-6f073225ed33/account-auditor/0.log" Oct 03 19:52:43 crc kubenswrapper[4835]: I1003 19:52:43.012676 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8359224-c77a-4d86-878b-6f073225ed33/account-reaper/0.log" Oct 03 19:52:43 crc kubenswrapper[4835]: I1003 19:52:43.219637 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8359224-c77a-4d86-878b-6f073225ed33/account-server/0.log" Oct 03 19:52:43 crc kubenswrapper[4835]: I1003 19:52:43.237836 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8359224-c77a-4d86-878b-6f073225ed33/account-replicator/0.log" Oct 03 19:52:43 crc kubenswrapper[4835]: I1003 19:52:43.313506 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8359224-c77a-4d86-878b-6f073225ed33/container-auditor/0.log" Oct 03 19:52:43 crc kubenswrapper[4835]: I1003 19:52:43.511278 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8359224-c77a-4d86-878b-6f073225ed33/container-server/0.log" Oct 03 19:52:43 crc kubenswrapper[4835]: I1003 19:52:43.548022 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8359224-c77a-4d86-878b-6f073225ed33/container-replicator/0.log" Oct 03 19:52:43 crc kubenswrapper[4835]: I1003 19:52:43.591714 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8359224-c77a-4d86-878b-6f073225ed33/container-updater/0.log" Oct 03 19:52:43 crc kubenswrapper[4835]: I1003 19:52:43.748072 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_c8359224-c77a-4d86-878b-6f073225ed33/object-expirer/0.log" Oct 03 19:52:43 crc kubenswrapper[4835]: I1003 19:52:43.844521 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8359224-c77a-4d86-878b-6f073225ed33/object-auditor/0.log" Oct 03 19:52:43 crc kubenswrapper[4835]: I1003 19:52:43.902662 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8359224-c77a-4d86-878b-6f073225ed33/object-replicator/0.log" Oct 03 19:52:44 crc kubenswrapper[4835]: I1003 19:52:44.001006 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8359224-c77a-4d86-878b-6f073225ed33/object-server/0.log" Oct 03 19:52:44 crc kubenswrapper[4835]: I1003 19:52:44.106134 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8359224-c77a-4d86-878b-6f073225ed33/object-updater/0.log" Oct 03 19:52:44 crc kubenswrapper[4835]: I1003 19:52:44.164196 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8359224-c77a-4d86-878b-6f073225ed33/rsync/0.log" Oct 03 19:52:44 crc kubenswrapper[4835]: I1003 19:52:44.208942 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8359224-c77a-4d86-878b-6f073225ed33/swift-recon-cron/0.log" Oct 03 19:52:44 crc kubenswrapper[4835]: I1003 19:52:44.514629 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-lgkdp_f2406a66-d20f-4ac5-9817-a1bf1ff38c5d/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:44 crc kubenswrapper[4835]: I1003 19:52:44.772938 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_85d214f1-e043-48fb-84e4-ef0f6134fac3/test-operator-logs-container/0.log" Oct 03 19:52:44 crc kubenswrapper[4835]: I1003 19:52:44.827523 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_87c3be87-c5ee-4d08-a75d-dfeb16c19d7e/tempest-tests-tempest-tests-runner/0.log" Oct 03 19:52:45 crc kubenswrapper[4835]: I1003 19:52:45.101862 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-gcph9_2cb161f9-3a8e-40ef-999e-03b98142d09d/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 19:52:46 crc kubenswrapper[4835]: I1003 19:52:46.254267 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_7a9c92f4-cd5c-4917-8ce8-5619892d5470/watcher-applier/0.log" Oct 03 19:52:46 crc kubenswrapper[4835]: I1003 19:52:46.572188 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_0aa3d7ce-c1f2-40a5-b63b-b39daee108fb/watcher-api-log/0.log" Oct 03 19:52:50 crc kubenswrapper[4835]: I1003 19:52:50.719086 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_929877a5-090b-46c5-ac19-f2ba3c72231f/watcher-decision-engine/0.log" Oct 03 19:52:51 crc kubenswrapper[4835]: I1003 19:52:51.694061 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_0aa3d7ce-c1f2-40a5-b63b-b39daee108fb/watcher-api/0.log" Oct 03 19:52:51 crc kubenswrapper[4835]: I1003 19:52:51.876802 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:52:51 crc kubenswrapper[4835]: E1003 19:52:51.878008 
4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:52:56 crc kubenswrapper[4835]: I1003 19:52:56.618682 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_69b75d74-7a6f-40ff-9c5c-481ced22eec0/memcached/0.log" Oct 03 19:53:06 crc kubenswrapper[4835]: I1003 19:53:06.877283 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:53:06 crc kubenswrapper[4835]: E1003 19:53:06.878347 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:53:19 crc kubenswrapper[4835]: I1003 19:53:19.878874 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:53:19 crc kubenswrapper[4835]: E1003 19:53:19.879775 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:53:27 crc kubenswrapper[4835]: I1003 19:53:27.952993 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zzv22"] Oct 03 19:53:27 crc kubenswrapper[4835]: E1003 19:53:27.954913 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03190a1a-5e00-41b1-ba99-945bb8e6fca5" containerName="registry-server" Oct 03 19:53:27 crc kubenswrapper[4835]: I1003 19:53:27.954944 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="03190a1a-5e00-41b1-ba99-945bb8e6fca5" containerName="registry-server" Oct 03 19:53:27 crc kubenswrapper[4835]: E1003 19:53:27.954982 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03190a1a-5e00-41b1-ba99-945bb8e6fca5" containerName="extract-utilities" Oct 03 19:53:27 crc kubenswrapper[4835]: I1003 19:53:27.954998 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="03190a1a-5e00-41b1-ba99-945bb8e6fca5" containerName="extract-utilities" Oct 03 19:53:27 crc kubenswrapper[4835]: E1003 19:53:27.955058 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03190a1a-5e00-41b1-ba99-945bb8e6fca5" containerName="extract-content" Oct 03 19:53:27 crc kubenswrapper[4835]: I1003 19:53:27.955115 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="03190a1a-5e00-41b1-ba99-945bb8e6fca5" containerName="extract-content" Oct 03 19:53:27 crc kubenswrapper[4835]: I1003 19:53:27.955556 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="03190a1a-5e00-41b1-ba99-945bb8e6fca5" containerName="registry-server" Oct 03 
19:53:27 crc kubenswrapper[4835]: I1003 19:53:27.959462 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:27 crc kubenswrapper[4835]: I1003 19:53:27.984600 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zzv22"] Oct 03 19:53:28 crc kubenswrapper[4835]: I1003 19:53:28.152414 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/273bae10-cb88-49e6-8754-c4828787f3fd-utilities\") pod \"community-operators-zzv22\" (UID: \"273bae10-cb88-49e6-8754-c4828787f3fd\") " pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:28 crc kubenswrapper[4835]: I1003 19:53:28.153555 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/273bae10-cb88-49e6-8754-c4828787f3fd-catalog-content\") pod \"community-operators-zzv22\" (UID: \"273bae10-cb88-49e6-8754-c4828787f3fd\") " pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:28 crc kubenswrapper[4835]: I1003 19:53:28.153823 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmzjl\" (UniqueName: \"kubernetes.io/projected/273bae10-cb88-49e6-8754-c4828787f3fd-kube-api-access-fmzjl\") pod \"community-operators-zzv22\" (UID: \"273bae10-cb88-49e6-8754-c4828787f3fd\") " pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:28 crc kubenswrapper[4835]: I1003 19:53:28.256646 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmzjl\" (UniqueName: \"kubernetes.io/projected/273bae10-cb88-49e6-8754-c4828787f3fd-kube-api-access-fmzjl\") pod \"community-operators-zzv22\" (UID: \"273bae10-cb88-49e6-8754-c4828787f3fd\") " pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:28 crc kubenswrapper[4835]: I1003 19:53:28.256787 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/273bae10-cb88-49e6-8754-c4828787f3fd-utilities\") pod \"community-operators-zzv22\" (UID: \"273bae10-cb88-49e6-8754-c4828787f3fd\") " pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:28 crc kubenswrapper[4835]: I1003 19:53:28.256871 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/273bae10-cb88-49e6-8754-c4828787f3fd-catalog-content\") pod \"community-operators-zzv22\" (UID: \"273bae10-cb88-49e6-8754-c4828787f3fd\") " pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:28 crc kubenswrapper[4835]: I1003 19:53:28.257505 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/273bae10-cb88-49e6-8754-c4828787f3fd-utilities\") pod \"community-operators-zzv22\" (UID: \"273bae10-cb88-49e6-8754-c4828787f3fd\") " pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:28 crc kubenswrapper[4835]: I1003 19:53:28.257627 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/273bae10-cb88-49e6-8754-c4828787f3fd-catalog-content\") pod \"community-operators-zzv22\" (UID: \"273bae10-cb88-49e6-8754-c4828787f3fd\") " 
pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:28 crc kubenswrapper[4835]: I1003 19:53:28.284557 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmzjl\" (UniqueName: \"kubernetes.io/projected/273bae10-cb88-49e6-8754-c4828787f3fd-kube-api-access-fmzjl\") pod \"community-operators-zzv22\" (UID: \"273bae10-cb88-49e6-8754-c4828787f3fd\") " pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:28 crc kubenswrapper[4835]: I1003 19:53:28.295943 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:28 crc kubenswrapper[4835]: I1003 19:53:28.861987 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zzv22"] Oct 03 19:53:29 crc kubenswrapper[4835]: I1003 19:53:29.378750 4835 generic.go:334] "Generic (PLEG): container finished" podID="273bae10-cb88-49e6-8754-c4828787f3fd" containerID="ef7b6f2afb6f860eaf7f849ea577607fcc7a3ada1ca636ba44f0ddc3373dcdd1" exitCode=0 Oct 03 19:53:29 crc kubenswrapper[4835]: I1003 19:53:29.378853 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzv22" event={"ID":"273bae10-cb88-49e6-8754-c4828787f3fd","Type":"ContainerDied","Data":"ef7b6f2afb6f860eaf7f849ea577607fcc7a3ada1ca636ba44f0ddc3373dcdd1"} Oct 03 19:53:29 crc kubenswrapper[4835]: I1003 19:53:29.379383 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzv22" event={"ID":"273bae10-cb88-49e6-8754-c4828787f3fd","Type":"ContainerStarted","Data":"1e330d59df72407c17858d368c69fa432ebd769a683a6222999b75cbfa770587"} Oct 03 19:53:29 crc kubenswrapper[4835]: I1003 19:53:29.386414 4835 generic.go:334] "Generic (PLEG): container finished" podID="37e3615e-8339-4df8-913d-b2d42eea5ca8" containerID="ff4407541c28b6a0560cc5f03b058a0838f9250417d895952203cc810d9abc9c" exitCode=0 Oct 03 19:53:29 crc kubenswrapper[4835]: I1003 19:53:29.386491 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j7c72/crc-debug-p6p8s" event={"ID":"37e3615e-8339-4df8-913d-b2d42eea5ca8","Type":"ContainerDied","Data":"ff4407541c28b6a0560cc5f03b058a0838f9250417d895952203cc810d9abc9c"} Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.402430 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzv22" event={"ID":"273bae10-cb88-49e6-8754-c4828787f3fd","Type":"ContainerStarted","Data":"38656510da19f882a460fd7b1331f44b9b69cf084b97160ee599514d58fc10ac"} Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.559250 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jhghq"] Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.562947 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j7c72/crc-debug-p6p8s" Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.563848 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e3615e-8339-4df8-913d-b2d42eea5ca8" containerName="container-00" Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.565461 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.583967 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhghq"] Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.640604 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-j7c72/crc-debug-p6p8s"] Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.649262 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crlwt\" (UniqueName: \"kubernetes.io/projected/37e3615e-8339-4df8-913d-b2d42eea5ca8-kube-api-access-crlwt\") pod \"37e3615e-8339-4df8-913d-b2d42eea5ca8\" (UID: \"37e3615e-8339-4df8-913d-b2d42eea5ca8\") " Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.649517 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37e3615e-8339-4df8-913d-b2d42eea5ca8-host\") pod \"37e3615e-8339-4df8-913d-b2d42eea5ca8\" (UID: \"37e3615e-8339-4df8-913d-b2d42eea5ca8\") " Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.649612 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37e3615e-8339-4df8-913d-b2d42eea5ca8-host" (OuterVolumeSpecName: "host") pod "37e3615e-8339-4df8-913d-b2d42eea5ca8" (UID: "37e3615e-8339-4df8-913d-b2d42eea5ca8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.650121 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-catalog-content\") pod \"redhat-marketplace-jhghq\" (UID: \"7dadd9e7-0f46-4bb5-aae4-46c518e64ced\") " pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.650242 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-utilities\") pod \"redhat-marketplace-jhghq\" (UID: \"7dadd9e7-0f46-4bb5-aae4-46c518e64ced\") " pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.650347 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f9np\" (UniqueName: \"kubernetes.io/projected/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-kube-api-access-7f9np\") pod \"redhat-marketplace-jhghq\" (UID: \"7dadd9e7-0f46-4bb5-aae4-46c518e64ced\") " pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.650441 4835 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37e3615e-8339-4df8-913d-b2d42eea5ca8-host\") on node \"crc\" DevicePath \"\"" Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.652829 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-j7c72/crc-debug-p6p8s"] Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.660645 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e3615e-8339-4df8-913d-b2d42eea5ca8-kube-api-access-crlwt" (OuterVolumeSpecName: "kube-api-access-crlwt") pod "37e3615e-8339-4df8-913d-b2d42eea5ca8" (UID: "37e3615e-8339-4df8-913d-b2d42eea5ca8"). 
InnerVolumeSpecName "kube-api-access-crlwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.753507 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f9np\" (UniqueName: \"kubernetes.io/projected/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-kube-api-access-7f9np\") pod \"redhat-marketplace-jhghq\" (UID: \"7dadd9e7-0f46-4bb5-aae4-46c518e64ced\") " pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.753653 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-catalog-content\") pod \"redhat-marketplace-jhghq\" (UID: \"7dadd9e7-0f46-4bb5-aae4-46c518e64ced\") " pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.753701 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-utilities\") pod \"redhat-marketplace-jhghq\" (UID: \"7dadd9e7-0f46-4bb5-aae4-46c518e64ced\") " pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.753804 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crlwt\" (UniqueName: \"kubernetes.io/projected/37e3615e-8339-4df8-913d-b2d42eea5ca8-kube-api-access-crlwt\") on node \"crc\" DevicePath \"\"" Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.754348 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-utilities\") pod \"redhat-marketplace-jhghq\" (UID: \"7dadd9e7-0f46-4bb5-aae4-46c518e64ced\") " pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.754532 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-catalog-content\") pod \"redhat-marketplace-jhghq\" (UID: \"7dadd9e7-0f46-4bb5-aae4-46c518e64ced\") " pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.775991 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f9np\" (UniqueName: \"kubernetes.io/projected/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-kube-api-access-7f9np\") pod \"redhat-marketplace-jhghq\" (UID: \"7dadd9e7-0f46-4bb5-aae4-46c518e64ced\") " pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:30 crc kubenswrapper[4835]: I1003 19:53:30.890473 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37e3615e-8339-4df8-913d-b2d42eea5ca8" path="/var/lib/kubelet/pods/37e3615e-8339-4df8-913d-b2d42eea5ca8/volumes" Oct 03 19:53:31 crc kubenswrapper[4835]: I1003 19:53:31.007983 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:31 crc kubenswrapper[4835]: I1003 19:53:31.416502 4835 generic.go:334] "Generic (PLEG): container finished" podID="273bae10-cb88-49e6-8754-c4828787f3fd" containerID="38656510da19f882a460fd7b1331f44b9b69cf084b97160ee599514d58fc10ac" exitCode=0 Oct 03 19:53:31 crc kubenswrapper[4835]: I1003 19:53:31.416655 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzv22" event={"ID":"273bae10-cb88-49e6-8754-c4828787f3fd","Type":"ContainerDied","Data":"38656510da19f882a460fd7b1331f44b9b69cf084b97160ee599514d58fc10ac"} Oct 03 19:53:31 crc kubenswrapper[4835]: I1003 19:53:31.420969 4835 scope.go:117] "RemoveContainer" containerID="ff4407541c28b6a0560cc5f03b058a0838f9250417d895952203cc810d9abc9c" Oct 03 19:53:31 crc kubenswrapper[4835]: I1003 19:53:31.421066 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j7c72/crc-debug-p6p8s" Oct 03 19:53:31 crc kubenswrapper[4835]: I1003 19:53:31.509579 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhghq"] Oct 03 19:53:31 crc kubenswrapper[4835]: W1003 19:53:31.516556 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dadd9e7_0f46_4bb5_aae4_46c518e64ced.slice/crio-92af036c4468758e507fcfb73b1c2ae7939b930810e0bf474339e56cbc8d3c46 WatchSource:0}: Error finding container 92af036c4468758e507fcfb73b1c2ae7939b930810e0bf474339e56cbc8d3c46: Status 404 returned error can't find the container with id 92af036c4468758e507fcfb73b1c2ae7939b930810e0bf474339e56cbc8d3c46 Oct 03 19:53:31 crc kubenswrapper[4835]: I1003 19:53:31.877504 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:53:31 crc kubenswrapper[4835]: E1003 19:53:31.877970 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:53:31 crc kubenswrapper[4835]: I1003 19:53:31.902615 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j7c72/crc-debug-xhrbb"] Oct 03 19:53:31 crc kubenswrapper[4835]: E1003 19:53:31.903301 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e3615e-8339-4df8-913d-b2d42eea5ca8" containerName="container-00" Oct 03 19:53:31 crc kubenswrapper[4835]: I1003 19:53:31.903324 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e3615e-8339-4df8-913d-b2d42eea5ca8" containerName="container-00" Oct 03 19:53:31 crc kubenswrapper[4835]: I1003 19:53:31.904393 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j7c72/crc-debug-xhrbb" Oct 03 19:53:31 crc kubenswrapper[4835]: I1003 19:53:31.994227 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3469d761-3379-4702-87e9-ac02f5d19273-host\") pod \"crc-debug-xhrbb\" (UID: \"3469d761-3379-4702-87e9-ac02f5d19273\") " pod="openshift-must-gather-j7c72/crc-debug-xhrbb" Oct 03 19:53:31 crc kubenswrapper[4835]: I1003 19:53:31.994907 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69h76\" (UniqueName: \"kubernetes.io/projected/3469d761-3379-4702-87e9-ac02f5d19273-kube-api-access-69h76\") pod \"crc-debug-xhrbb\" (UID: \"3469d761-3379-4702-87e9-ac02f5d19273\") " pod="openshift-must-gather-j7c72/crc-debug-xhrbb" Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.097523 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69h76\" (UniqueName: \"kubernetes.io/projected/3469d761-3379-4702-87e9-ac02f5d19273-kube-api-access-69h76\") pod \"crc-debug-xhrbb\" (UID: \"3469d761-3379-4702-87e9-ac02f5d19273\") " pod="openshift-must-gather-j7c72/crc-debug-xhrbb" Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.097712 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3469d761-3379-4702-87e9-ac02f5d19273-host\") pod \"crc-debug-xhrbb\" (UID: \"3469d761-3379-4702-87e9-ac02f5d19273\") " pod="openshift-must-gather-j7c72/crc-debug-xhrbb" Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.097899 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3469d761-3379-4702-87e9-ac02f5d19273-host\") pod \"crc-debug-xhrbb\" (UID: \"3469d761-3379-4702-87e9-ac02f5d19273\") " pod="openshift-must-gather-j7c72/crc-debug-xhrbb" Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.120881 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69h76\" (UniqueName: \"kubernetes.io/projected/3469d761-3379-4702-87e9-ac02f5d19273-kube-api-access-69h76\") pod \"crc-debug-xhrbb\" (UID: \"3469d761-3379-4702-87e9-ac02f5d19273\") " pod="openshift-must-gather-j7c72/crc-debug-xhrbb" Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.232498 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j7c72/crc-debug-xhrbb" Oct 03 19:53:32 crc kubenswrapper[4835]: W1003 19:53:32.264025 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3469d761_3379_4702_87e9_ac02f5d19273.slice/crio-2eab30d53887efef4a8b97b63db784fc84d235314c91e3cd3ad7bdf41a4dadb0 WatchSource:0}: Error finding container 2eab30d53887efef4a8b97b63db784fc84d235314c91e3cd3ad7bdf41a4dadb0: Status 404 returned error can't find the container with id 2eab30d53887efef4a8b97b63db784fc84d235314c91e3cd3ad7bdf41a4dadb0 Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.433477 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j7c72/crc-debug-xhrbb" event={"ID":"3469d761-3379-4702-87e9-ac02f5d19273","Type":"ContainerStarted","Data":"2eab30d53887efef4a8b97b63db784fc84d235314c91e3cd3ad7bdf41a4dadb0"} Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.438522 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzv22" event={"ID":"273bae10-cb88-49e6-8754-c4828787f3fd","Type":"ContainerStarted","Data":"e3c82bd2fd0263294a6a972c920df44c752736c2ac9149b33d0e60a81b5dec16"} Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.441160 4835 generic.go:334] "Generic (PLEG): container finished" podID="7dadd9e7-0f46-4bb5-aae4-46c518e64ced" containerID="696b3a401625a497808101483e6c024282fdd556d9827bd1de82e05fa526db9c" exitCode=0 Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.441226 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhghq" event={"ID":"7dadd9e7-0f46-4bb5-aae4-46c518e64ced","Type":"ContainerDied","Data":"696b3a401625a497808101483e6c024282fdd556d9827bd1de82e05fa526db9c"} Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.441659 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhghq" event={"ID":"7dadd9e7-0f46-4bb5-aae4-46c518e64ced","Type":"ContainerStarted","Data":"92af036c4468758e507fcfb73b1c2ae7939b930810e0bf474339e56cbc8d3c46"} Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.469228 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zzv22" podStartSLOduration=2.98330356 podStartE2EDuration="5.469203945s" podCreationTimestamp="2025-10-03 19:53:27 +0000 UTC" firstStartedPulling="2025-10-03 19:53:29.383063198 +0000 UTC m=+5951.099004070" lastFinishedPulling="2025-10-03 19:53:31.868963543 +0000 UTC m=+5953.584904455" observedRunningTime="2025-10-03 19:53:32.460668156 +0000 UTC m=+5954.176609028" watchObservedRunningTime="2025-10-03 19:53:32.469203945 +0000 UTC m=+5954.185144827" Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.724390 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f284b"] Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.728962 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.735941 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f284b"] Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.814510 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/314c5c33-739b-4996-b64d-d21be1fb411e-catalog-content\") pod \"redhat-operators-f284b\" (UID: \"314c5c33-739b-4996-b64d-d21be1fb411e\") " pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.814688 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vftkv\" (UniqueName: \"kubernetes.io/projected/314c5c33-739b-4996-b64d-d21be1fb411e-kube-api-access-vftkv\") pod \"redhat-operators-f284b\" (UID: \"314c5c33-739b-4996-b64d-d21be1fb411e\") " pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.814948 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/314c5c33-739b-4996-b64d-d21be1fb411e-utilities\") pod \"redhat-operators-f284b\" (UID: \"314c5c33-739b-4996-b64d-d21be1fb411e\") " pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.917872 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/314c5c33-739b-4996-b64d-d21be1fb411e-catalog-content\") pod \"redhat-operators-f284b\" (UID: \"314c5c33-739b-4996-b64d-d21be1fb411e\") " pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.918858 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/314c5c33-739b-4996-b64d-d21be1fb411e-catalog-content\") pod \"redhat-operators-f284b\" (UID: \"314c5c33-739b-4996-b64d-d21be1fb411e\") " pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.919042 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vftkv\" (UniqueName: \"kubernetes.io/projected/314c5c33-739b-4996-b64d-d21be1fb411e-kube-api-access-vftkv\") pod \"redhat-operators-f284b\" (UID: \"314c5c33-739b-4996-b64d-d21be1fb411e\") " pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.919596 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/314c5c33-739b-4996-b64d-d21be1fb411e-utilities\") pod \"redhat-operators-f284b\" (UID: \"314c5c33-739b-4996-b64d-d21be1fb411e\") " pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.920055 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/314c5c33-739b-4996-b64d-d21be1fb411e-utilities\") pod \"redhat-operators-f284b\" (UID: \"314c5c33-739b-4996-b64d-d21be1fb411e\") " pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:32 crc kubenswrapper[4835]: I1003 19:53:32.950633 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vftkv\" (UniqueName: \"kubernetes.io/projected/314c5c33-739b-4996-b64d-d21be1fb411e-kube-api-access-vftkv\") pod \"redhat-operators-f284b\" (UID: \"314c5c33-739b-4996-b64d-d21be1fb411e\") " pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:33 crc kubenswrapper[4835]: I1003 19:53:33.063289 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:33 crc kubenswrapper[4835]: I1003 19:53:33.453721 4835 generic.go:334] "Generic (PLEG): container finished" podID="3469d761-3379-4702-87e9-ac02f5d19273" containerID="1896120bec1f9e369ea27e31e08b60a5f9c11448e0e5d6720059f4a644b346cf" exitCode=0 Oct 03 19:53:33 crc kubenswrapper[4835]: I1003 19:53:33.453865 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j7c72/crc-debug-xhrbb" event={"ID":"3469d761-3379-4702-87e9-ac02f5d19273","Type":"ContainerDied","Data":"1896120bec1f9e369ea27e31e08b60a5f9c11448e0e5d6720059f4a644b346cf"} Oct 03 19:53:33 crc kubenswrapper[4835]: I1003 19:53:33.618142 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f284b"] Oct 03 19:53:34 crc kubenswrapper[4835]: I1003 19:53:34.471499 4835 generic.go:334] "Generic (PLEG): container finished" podID="314c5c33-739b-4996-b64d-d21be1fb411e" containerID="ed4419e8677457ed3f953c4ea73fa6b89dfb89552b4ed2b3ccff373f5d763cb7" exitCode=0 Oct 03 19:53:34 crc kubenswrapper[4835]: I1003 19:53:34.471601 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f284b" event={"ID":"314c5c33-739b-4996-b64d-d21be1fb411e","Type":"ContainerDied","Data":"ed4419e8677457ed3f953c4ea73fa6b89dfb89552b4ed2b3ccff373f5d763cb7"} Oct 03 19:53:34 crc kubenswrapper[4835]: I1003 19:53:34.473366 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f284b" event={"ID":"314c5c33-739b-4996-b64d-d21be1fb411e","Type":"ContainerStarted","Data":"64bd7783c1bc013ac47c0a2fe979fa1e9b1c93bb5d901d5dcc194b14d43703aa"} Oct 03 19:53:34 crc kubenswrapper[4835]: I1003 19:53:34.476723 4835 generic.go:334] "Generic (PLEG): container finished" podID="7dadd9e7-0f46-4bb5-aae4-46c518e64ced" containerID="7ed23fdba11117a0a022a5119099052b50d4d5ee000cb53012f6341e091a0592" exitCode=0 Oct 03 19:53:34 crc kubenswrapper[4835]: I1003 19:53:34.476963 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhghq" event={"ID":"7dadd9e7-0f46-4bb5-aae4-46c518e64ced","Type":"ContainerDied","Data":"7ed23fdba11117a0a022a5119099052b50d4d5ee000cb53012f6341e091a0592"} Oct 03 19:53:34 crc kubenswrapper[4835]: I1003 19:53:34.580847 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j7c72/crc-debug-xhrbb" Oct 03 19:53:34 crc kubenswrapper[4835]: I1003 19:53:34.672245 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3469d761-3379-4702-87e9-ac02f5d19273-host\") pod \"3469d761-3379-4702-87e9-ac02f5d19273\" (UID: \"3469d761-3379-4702-87e9-ac02f5d19273\") " Oct 03 19:53:34 crc kubenswrapper[4835]: I1003 19:53:34.672318 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69h76\" (UniqueName: \"kubernetes.io/projected/3469d761-3379-4702-87e9-ac02f5d19273-kube-api-access-69h76\") pod \"3469d761-3379-4702-87e9-ac02f5d19273\" (UID: \"3469d761-3379-4702-87e9-ac02f5d19273\") " Oct 03 19:53:34 crc kubenswrapper[4835]: I1003 19:53:34.672334 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3469d761-3379-4702-87e9-ac02f5d19273-host" (OuterVolumeSpecName: "host") pod "3469d761-3379-4702-87e9-ac02f5d19273" (UID: "3469d761-3379-4702-87e9-ac02f5d19273"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 19:53:34 crc kubenswrapper[4835]: I1003 19:53:34.673183 4835 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3469d761-3379-4702-87e9-ac02f5d19273-host\") on node \"crc\" DevicePath \"\"" Oct 03 19:53:34 crc kubenswrapper[4835]: I1003 19:53:34.679895 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3469d761-3379-4702-87e9-ac02f5d19273-kube-api-access-69h76" (OuterVolumeSpecName: "kube-api-access-69h76") pod "3469d761-3379-4702-87e9-ac02f5d19273" (UID: "3469d761-3379-4702-87e9-ac02f5d19273"). InnerVolumeSpecName "kube-api-access-69h76". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:53:34 crc kubenswrapper[4835]: I1003 19:53:34.775048 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69h76\" (UniqueName: \"kubernetes.io/projected/3469d761-3379-4702-87e9-ac02f5d19273-kube-api-access-69h76\") on node \"crc\" DevicePath \"\"" Oct 03 19:53:35 crc kubenswrapper[4835]: I1003 19:53:35.499085 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhghq" event={"ID":"7dadd9e7-0f46-4bb5-aae4-46c518e64ced","Type":"ContainerStarted","Data":"48b1ec923f9e4474e3f25dfe57646e51859d202f5fdb7a0f70a74b19c1df4421"} Oct 03 19:53:35 crc kubenswrapper[4835]: I1003 19:53:35.507226 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j7c72/crc-debug-xhrbb" event={"ID":"3469d761-3379-4702-87e9-ac02f5d19273","Type":"ContainerDied","Data":"2eab30d53887efef4a8b97b63db784fc84d235314c91e3cd3ad7bdf41a4dadb0"} Oct 03 19:53:35 crc kubenswrapper[4835]: I1003 19:53:35.507706 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eab30d53887efef4a8b97b63db784fc84d235314c91e3cd3ad7bdf41a4dadb0" Oct 03 19:53:35 crc kubenswrapper[4835]: I1003 19:53:35.507720 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j7c72/crc-debug-xhrbb" Oct 03 19:53:35 crc kubenswrapper[4835]: I1003 19:53:35.519561 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jhghq" podStartSLOduration=2.897461169 podStartE2EDuration="5.519543213s" podCreationTimestamp="2025-10-03 19:53:30 +0000 UTC" firstStartedPulling="2025-10-03 19:53:32.443455403 +0000 UTC m=+5954.159396275" lastFinishedPulling="2025-10-03 19:53:35.065537457 +0000 UTC m=+5956.781478319" observedRunningTime="2025-10-03 19:53:35.51780487 +0000 UTC m=+5957.233745742" watchObservedRunningTime="2025-10-03 19:53:35.519543213 +0000 UTC m=+5957.235484085" Oct 03 19:53:36 crc kubenswrapper[4835]: I1003 19:53:36.533236 4835 generic.go:334] "Generic (PLEG): container finished" podID="314c5c33-739b-4996-b64d-d21be1fb411e" containerID="02f7760cb0145c8aef5913c0fed0f5692a20137426080f325f137c398b79d43a" exitCode=0 Oct 03 19:53:36 crc kubenswrapper[4835]: I1003 19:53:36.539362 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f284b" event={"ID":"314c5c33-739b-4996-b64d-d21be1fb411e","Type":"ContainerDied","Data":"02f7760cb0145c8aef5913c0fed0f5692a20137426080f325f137c398b79d43a"} Oct 03 19:53:37 crc kubenswrapper[4835]: I1003 19:53:37.553189 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f284b" event={"ID":"314c5c33-739b-4996-b64d-d21be1fb411e","Type":"ContainerStarted","Data":"b23190634f4df80b6834fbedb03197d972484960e2284649d78207af4f29c3f8"} Oct 03 19:53:37 crc kubenswrapper[4835]: I1003 19:53:37.581542 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f284b" podStartSLOduration=3.07337206 podStartE2EDuration="5.581519803s" podCreationTimestamp="2025-10-03 19:53:32 +0000 UTC" firstStartedPulling="2025-10-03 19:53:34.473943289 +0000 UTC m=+5956.189884161" lastFinishedPulling="2025-10-03 19:53:36.982091012 +0000 UTC m=+5958.698031904" observedRunningTime="2025-10-03 19:53:37.575558727 +0000 UTC m=+5959.291499599" watchObservedRunningTime="2025-10-03 19:53:37.581519803 +0000 UTC m=+5959.297460665" Oct 03 19:53:38 crc kubenswrapper[4835]: I1003 19:53:38.296986 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:38 crc kubenswrapper[4835]: I1003 19:53:38.297030 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:38 crc kubenswrapper[4835]: I1003 19:53:38.358769 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:38 crc kubenswrapper[4835]: I1003 19:53:38.623743 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:41 crc kubenswrapper[4835]: I1003 19:53:41.008151 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:41 crc kubenswrapper[4835]: I1003 19:53:41.008558 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:41 crc kubenswrapper[4835]: I1003 19:53:41.077065 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:41 crc kubenswrapper[4835]: I1003 19:53:41.310620 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zzv22"] Oct 03 19:53:41 crc kubenswrapper[4835]: I1003 19:53:41.310871 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zzv22" podUID="273bae10-cb88-49e6-8754-c4828787f3fd" containerName="registry-server" containerID="cri-o://e3c82bd2fd0263294a6a972c920df44c752736c2ac9149b33d0e60a81b5dec16" gracePeriod=2 Oct 03 19:53:41 crc kubenswrapper[4835]: I1003 19:53:41.608319 4835 generic.go:334] "Generic (PLEG): container finished" podID="273bae10-cb88-49e6-8754-c4828787f3fd" containerID="e3c82bd2fd0263294a6a972c920df44c752736c2ac9149b33d0e60a81b5dec16" exitCode=0 Oct 03 19:53:41 crc kubenswrapper[4835]: I1003 19:53:41.610101 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzv22" event={"ID":"273bae10-cb88-49e6-8754-c4828787f3fd","Type":"ContainerDied","Data":"e3c82bd2fd0263294a6a972c920df44c752736c2ac9149b33d0e60a81b5dec16"} Oct 03 19:53:41 crc kubenswrapper[4835]: I1003 19:53:41.700813 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:41 crc kubenswrapper[4835]: I1003 19:53:41.954911 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.083742 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmzjl\" (UniqueName: \"kubernetes.io/projected/273bae10-cb88-49e6-8754-c4828787f3fd-kube-api-access-fmzjl\") pod \"273bae10-cb88-49e6-8754-c4828787f3fd\" (UID: \"273bae10-cb88-49e6-8754-c4828787f3fd\") " Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.083932 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/273bae10-cb88-49e6-8754-c4828787f3fd-catalog-content\") pod \"273bae10-cb88-49e6-8754-c4828787f3fd\" (UID: \"273bae10-cb88-49e6-8754-c4828787f3fd\") " Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.086960 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/273bae10-cb88-49e6-8754-c4828787f3fd-utilities\") pod \"273bae10-cb88-49e6-8754-c4828787f3fd\" (UID: \"273bae10-cb88-49e6-8754-c4828787f3fd\") " Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.087887 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/273bae10-cb88-49e6-8754-c4828787f3fd-utilities" (OuterVolumeSpecName: "utilities") pod "273bae10-cb88-49e6-8754-c4828787f3fd" (UID: "273bae10-cb88-49e6-8754-c4828787f3fd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.090489 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/273bae10-cb88-49e6-8754-c4828787f3fd-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.092688 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/273bae10-cb88-49e6-8754-c4828787f3fd-kube-api-access-fmzjl" (OuterVolumeSpecName: "kube-api-access-fmzjl") pod "273bae10-cb88-49e6-8754-c4828787f3fd" (UID: "273bae10-cb88-49e6-8754-c4828787f3fd"). InnerVolumeSpecName "kube-api-access-fmzjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.144777 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/273bae10-cb88-49e6-8754-c4828787f3fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "273bae10-cb88-49e6-8754-c4828787f3fd" (UID: "273bae10-cb88-49e6-8754-c4828787f3fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.192226 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmzjl\" (UniqueName: \"kubernetes.io/projected/273bae10-cb88-49e6-8754-c4828787f3fd-kube-api-access-fmzjl\") on node \"crc\" DevicePath \"\"" Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.192268 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/273bae10-cb88-49e6-8754-c4828787f3fd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.623377 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zzv22" Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.623558 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzv22" event={"ID":"273bae10-cb88-49e6-8754-c4828787f3fd","Type":"ContainerDied","Data":"1e330d59df72407c17858d368c69fa432ebd769a683a6222999b75cbfa770587"} Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.623624 4835 scope.go:117] "RemoveContainer" containerID="e3c82bd2fd0263294a6a972c920df44c752736c2ac9149b33d0e60a81b5dec16" Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.649285 4835 scope.go:117] "RemoveContainer" containerID="38656510da19f882a460fd7b1331f44b9b69cf084b97160ee599514d58fc10ac" Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.674061 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zzv22"] Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.686662 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zzv22"] Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.707452 4835 scope.go:117] "RemoveContainer" containerID="ef7b6f2afb6f860eaf7f849ea577607fcc7a3ada1ca636ba44f0ddc3373dcdd1" Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.718138 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhghq"] Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.880466 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:53:42 crc kubenswrapper[4835]: E1003 19:53:42.881181 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:53:42 crc kubenswrapper[4835]: I1003 19:53:42.895616 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="273bae10-cb88-49e6-8754-c4828787f3fd" path="/var/lib/kubelet/pods/273bae10-cb88-49e6-8754-c4828787f3fd/volumes" Oct 03 19:53:43 crc kubenswrapper[4835]: I1003 19:53:43.064401 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:43 crc kubenswrapper[4835]: I1003 19:53:43.064462 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:43 crc kubenswrapper[4835]: I1003 19:53:43.136924 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:43 crc kubenswrapper[4835]: I1003 19:53:43.637707 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jhghq" podUID="7dadd9e7-0f46-4bb5-aae4-46c518e64ced" containerName="registry-server" containerID="cri-o://48b1ec923f9e4474e3f25dfe57646e51859d202f5fdb7a0f70a74b19c1df4421" gracePeriod=2 Oct 03 19:53:43 crc kubenswrapper[4835]: I1003 19:53:43.701049 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:43 crc kubenswrapper[4835]: I1003 
19:53:43.717877 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-j7c72/crc-debug-xhrbb"] Oct 03 19:53:43 crc kubenswrapper[4835]: I1003 19:53:43.726928 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-j7c72/crc-debug-xhrbb"] Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.653014 4835 generic.go:334] "Generic (PLEG): container finished" podID="7dadd9e7-0f46-4bb5-aae4-46c518e64ced" containerID="48b1ec923f9e4474e3f25dfe57646e51859d202f5fdb7a0f70a74b19c1df4421" exitCode=0 Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.653165 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhghq" event={"ID":"7dadd9e7-0f46-4bb5-aae4-46c518e64ced","Type":"ContainerDied","Data":"48b1ec923f9e4474e3f25dfe57646e51859d202f5fdb7a0f70a74b19c1df4421"} Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.790448 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.866955 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-utilities\") pod \"7dadd9e7-0f46-4bb5-aae4-46c518e64ced\" (UID: \"7dadd9e7-0f46-4bb5-aae4-46c518e64ced\") " Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.867101 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-catalog-content\") pod \"7dadd9e7-0f46-4bb5-aae4-46c518e64ced\" (UID: \"7dadd9e7-0f46-4bb5-aae4-46c518e64ced\") " Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.867337 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f9np\" (UniqueName: \"kubernetes.io/projected/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-kube-api-access-7f9np\") pod \"7dadd9e7-0f46-4bb5-aae4-46c518e64ced\" (UID: \"7dadd9e7-0f46-4bb5-aae4-46c518e64ced\") " Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.867962 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-utilities" (OuterVolumeSpecName: "utilities") pod "7dadd9e7-0f46-4bb5-aae4-46c518e64ced" (UID: "7dadd9e7-0f46-4bb5-aae4-46c518e64ced"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.880116 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-kube-api-access-7f9np" (OuterVolumeSpecName: "kube-api-access-7f9np") pod "7dadd9e7-0f46-4bb5-aae4-46c518e64ced" (UID: "7dadd9e7-0f46-4bb5-aae4-46c518e64ced"). InnerVolumeSpecName "kube-api-access-7f9np". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.890616 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3469d761-3379-4702-87e9-ac02f5d19273" path="/var/lib/kubelet/pods/3469d761-3379-4702-87e9-ac02f5d19273/volumes" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.892851 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7dadd9e7-0f46-4bb5-aae4-46c518e64ced" (UID: "7dadd9e7-0f46-4bb5-aae4-46c518e64ced"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.971222 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.971266 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f9np\" (UniqueName: \"kubernetes.io/projected/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-kube-api-access-7f9np\") on node \"crc\" DevicePath \"\"" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.971278 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dadd9e7-0f46-4bb5-aae4-46c518e64ced-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.972569 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-j7c72/crc-debug-rhx64"] Oct 03 19:53:44 crc kubenswrapper[4835]: E1003 19:53:44.973098 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273bae10-cb88-49e6-8754-c4828787f3fd" containerName="registry-server" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.973118 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="273bae10-cb88-49e6-8754-c4828787f3fd" containerName="registry-server" Oct 03 19:53:44 crc kubenswrapper[4835]: E1003 19:53:44.973135 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273bae10-cb88-49e6-8754-c4828787f3fd" containerName="extract-utilities" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.973142 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="273bae10-cb88-49e6-8754-c4828787f3fd" containerName="extract-utilities" Oct 03 19:53:44 crc kubenswrapper[4835]: E1003 19:53:44.973167 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dadd9e7-0f46-4bb5-aae4-46c518e64ced" containerName="registry-server" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.973173 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dadd9e7-0f46-4bb5-aae4-46c518e64ced" containerName="registry-server" Oct 03 19:53:44 crc kubenswrapper[4835]: E1003 19:53:44.973188 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dadd9e7-0f46-4bb5-aae4-46c518e64ced" containerName="extract-utilities" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.973193 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dadd9e7-0f46-4bb5-aae4-46c518e64ced" containerName="extract-utilities" Oct 03 19:53:44 crc kubenswrapper[4835]: E1003 19:53:44.973204 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dadd9e7-0f46-4bb5-aae4-46c518e64ced" containerName="extract-content" Oct 03 19:53:44 crc 
kubenswrapper[4835]: I1003 19:53:44.973210 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dadd9e7-0f46-4bb5-aae4-46c518e64ced" containerName="extract-content" Oct 03 19:53:44 crc kubenswrapper[4835]: E1003 19:53:44.973226 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3469d761-3379-4702-87e9-ac02f5d19273" containerName="container-00" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.973234 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3469d761-3379-4702-87e9-ac02f5d19273" containerName="container-00" Oct 03 19:53:44 crc kubenswrapper[4835]: E1003 19:53:44.973250 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273bae10-cb88-49e6-8754-c4828787f3fd" containerName="extract-content" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.973256 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="273bae10-cb88-49e6-8754-c4828787f3fd" containerName="extract-content" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.973457 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dadd9e7-0f46-4bb5-aae4-46c518e64ced" containerName="registry-server" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.973480 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="273bae10-cb88-49e6-8754-c4828787f3fd" containerName="registry-server" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.973495 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3469d761-3379-4702-87e9-ac02f5d19273" containerName="container-00" Oct 03 19:53:44 crc kubenswrapper[4835]: I1003 19:53:44.974372 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j7c72/crc-debug-rhx64" Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.074278 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be514a5b-7674-404e-ad18-2cd515cc6c57-host\") pod \"crc-debug-rhx64\" (UID: \"be514a5b-7674-404e-ad18-2cd515cc6c57\") " pod="openshift-must-gather-j7c72/crc-debug-rhx64" Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.074402 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5srs\" (UniqueName: \"kubernetes.io/projected/be514a5b-7674-404e-ad18-2cd515cc6c57-kube-api-access-q5srs\") pod \"crc-debug-rhx64\" (UID: \"be514a5b-7674-404e-ad18-2cd515cc6c57\") " pod="openshift-must-gather-j7c72/crc-debug-rhx64" Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.176811 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be514a5b-7674-404e-ad18-2cd515cc6c57-host\") pod \"crc-debug-rhx64\" (UID: \"be514a5b-7674-404e-ad18-2cd515cc6c57\") " pod="openshift-must-gather-j7c72/crc-debug-rhx64" Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.176916 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5srs\" (UniqueName: \"kubernetes.io/projected/be514a5b-7674-404e-ad18-2cd515cc6c57-kube-api-access-q5srs\") pod \"crc-debug-rhx64\" (UID: \"be514a5b-7674-404e-ad18-2cd515cc6c57\") " pod="openshift-must-gather-j7c72/crc-debug-rhx64" Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.176962 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be514a5b-7674-404e-ad18-2cd515cc6c57-host\") pod 
\"crc-debug-rhx64\" (UID: \"be514a5b-7674-404e-ad18-2cd515cc6c57\") " pod="openshift-must-gather-j7c72/crc-debug-rhx64" Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.198927 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5srs\" (UniqueName: \"kubernetes.io/projected/be514a5b-7674-404e-ad18-2cd515cc6c57-kube-api-access-q5srs\") pod \"crc-debug-rhx64\" (UID: \"be514a5b-7674-404e-ad18-2cd515cc6c57\") " pod="openshift-must-gather-j7c72/crc-debug-rhx64" Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.297010 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j7c72/crc-debug-rhx64" Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.677664 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhghq" event={"ID":"7dadd9e7-0f46-4bb5-aae4-46c518e64ced","Type":"ContainerDied","Data":"92af036c4468758e507fcfb73b1c2ae7939b930810e0bf474339e56cbc8d3c46"} Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.677736 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jhghq" Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.678210 4835 scope.go:117] "RemoveContainer" containerID="48b1ec923f9e4474e3f25dfe57646e51859d202f5fdb7a0f70a74b19c1df4421" Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.682916 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j7c72/crc-debug-rhx64" event={"ID":"be514a5b-7674-404e-ad18-2cd515cc6c57","Type":"ContainerStarted","Data":"57650d8c39b3da59b6ff37cd1ee6c20f785e8e43d91ec60e7de4bc5861248e1d"} Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.683028 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j7c72/crc-debug-rhx64" event={"ID":"be514a5b-7674-404e-ad18-2cd515cc6c57","Type":"ContainerStarted","Data":"4a324611c5a332820e29351c6bb63620c05d24ac2040a057fa4d20bad7cbea81"} Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.756651 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-j7c72/crc-debug-rhx64"] Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.758350 4835 scope.go:117] "RemoveContainer" containerID="7ed23fdba11117a0a022a5119099052b50d4d5ee000cb53012f6341e091a0592" Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.771904 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-j7c72/crc-debug-rhx64"] Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.784371 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhghq"] Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.786366 4835 scope.go:117] "RemoveContainer" containerID="696b3a401625a497808101483e6c024282fdd556d9827bd1de82e05fa526db9c" Oct 03 19:53:45 crc kubenswrapper[4835]: I1003 19:53:45.800106 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhghq"] Oct 03 19:53:46 crc kubenswrapper[4835]: I1003 19:53:46.699428 4835 generic.go:334] "Generic (PLEG): container finished" podID="be514a5b-7674-404e-ad18-2cd515cc6c57" containerID="57650d8c39b3da59b6ff37cd1ee6c20f785e8e43d91ec60e7de4bc5861248e1d" exitCode=0 Oct 03 19:53:46 crc kubenswrapper[4835]: I1003 19:53:46.839570 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j7c72/crc-debug-rhx64" Oct 03 19:53:46 crc kubenswrapper[4835]: I1003 19:53:46.901535 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dadd9e7-0f46-4bb5-aae4-46c518e64ced" path="/var/lib/kubelet/pods/7dadd9e7-0f46-4bb5-aae4-46c518e64ced/volumes" Oct 03 19:53:46 crc kubenswrapper[4835]: I1003 19:53:46.922180 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be514a5b-7674-404e-ad18-2cd515cc6c57-host\") pod \"be514a5b-7674-404e-ad18-2cd515cc6c57\" (UID: \"be514a5b-7674-404e-ad18-2cd515cc6c57\") " Oct 03 19:53:46 crc kubenswrapper[4835]: I1003 19:53:46.922310 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be514a5b-7674-404e-ad18-2cd515cc6c57-host" (OuterVolumeSpecName: "host") pod "be514a5b-7674-404e-ad18-2cd515cc6c57" (UID: "be514a5b-7674-404e-ad18-2cd515cc6c57"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 19:53:46 crc kubenswrapper[4835]: I1003 19:53:46.922411 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5srs\" (UniqueName: \"kubernetes.io/projected/be514a5b-7674-404e-ad18-2cd515cc6c57-kube-api-access-q5srs\") pod \"be514a5b-7674-404e-ad18-2cd515cc6c57\" (UID: \"be514a5b-7674-404e-ad18-2cd515cc6c57\") " Oct 03 19:53:46 crc kubenswrapper[4835]: I1003 19:53:46.923142 4835 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be514a5b-7674-404e-ad18-2cd515cc6c57-host\") on node \"crc\" DevicePath \"\"" Oct 03 19:53:46 crc kubenswrapper[4835]: I1003 19:53:46.930976 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be514a5b-7674-404e-ad18-2cd515cc6c57-kube-api-access-q5srs" (OuterVolumeSpecName: "kube-api-access-q5srs") pod "be514a5b-7674-404e-ad18-2cd515cc6c57" (UID: "be514a5b-7674-404e-ad18-2cd515cc6c57"). InnerVolumeSpecName "kube-api-access-q5srs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.025990 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5srs\" (UniqueName: \"kubernetes.io/projected/be514a5b-7674-404e-ad18-2cd515cc6c57-kube-api-access-q5srs\") on node \"crc\" DevicePath \"\"" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.110231 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f284b"] Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.110547 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f284b" podUID="314c5c33-739b-4996-b64d-d21be1fb411e" containerName="registry-server" containerID="cri-o://b23190634f4df80b6834fbedb03197d972484960e2284649d78207af4f29c3f8" gracePeriod=2 Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.610614 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.680651 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt_a719243c-35c2-4ecf-89ff-7843179f36de/util/0.log" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.725055 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-j7c72/crc-debug-rhx64" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.725192 4835 scope.go:117] "RemoveContainer" containerID="57650d8c39b3da59b6ff37cd1ee6c20f785e8e43d91ec60e7de4bc5861248e1d" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.740048 4835 generic.go:334] "Generic (PLEG): container finished" podID="314c5c33-739b-4996-b64d-d21be1fb411e" containerID="b23190634f4df80b6834fbedb03197d972484960e2284649d78207af4f29c3f8" exitCode=0 Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.740129 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f284b" event={"ID":"314c5c33-739b-4996-b64d-d21be1fb411e","Type":"ContainerDied","Data":"b23190634f4df80b6834fbedb03197d972484960e2284649d78207af4f29c3f8"} Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.740162 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f284b" event={"ID":"314c5c33-739b-4996-b64d-d21be1fb411e","Type":"ContainerDied","Data":"64bd7783c1bc013ac47c0a2fe979fa1e9b1c93bb5d901d5dcc194b14d43703aa"} Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.740234 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f284b" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.742793 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vftkv\" (UniqueName: \"kubernetes.io/projected/314c5c33-739b-4996-b64d-d21be1fb411e-kube-api-access-vftkv\") pod \"314c5c33-739b-4996-b64d-d21be1fb411e\" (UID: \"314c5c33-739b-4996-b64d-d21be1fb411e\") " Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.743128 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/314c5c33-739b-4996-b64d-d21be1fb411e-catalog-content\") pod \"314c5c33-739b-4996-b64d-d21be1fb411e\" (UID: \"314c5c33-739b-4996-b64d-d21be1fb411e\") " Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.743166 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/314c5c33-739b-4996-b64d-d21be1fb411e-utilities\") pod \"314c5c33-739b-4996-b64d-d21be1fb411e\" (UID: \"314c5c33-739b-4996-b64d-d21be1fb411e\") " Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.744473 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/314c5c33-739b-4996-b64d-d21be1fb411e-utilities" (OuterVolumeSpecName: "utilities") pod "314c5c33-739b-4996-b64d-d21be1fb411e" (UID: "314c5c33-739b-4996-b64d-d21be1fb411e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.750423 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/314c5c33-739b-4996-b64d-d21be1fb411e-kube-api-access-vftkv" (OuterVolumeSpecName: "kube-api-access-vftkv") pod "314c5c33-739b-4996-b64d-d21be1fb411e" (UID: "314c5c33-739b-4996-b64d-d21be1fb411e"). InnerVolumeSpecName "kube-api-access-vftkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.770400 4835 scope.go:117] "RemoveContainer" containerID="b23190634f4df80b6834fbedb03197d972484960e2284649d78207af4f29c3f8" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.838965 4835 scope.go:117] "RemoveContainer" containerID="02f7760cb0145c8aef5913c0fed0f5692a20137426080f325f137c398b79d43a" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.849019 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vftkv\" (UniqueName: \"kubernetes.io/projected/314c5c33-739b-4996-b64d-d21be1fb411e-kube-api-access-vftkv\") on node \"crc\" DevicePath \"\"" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.849059 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/314c5c33-739b-4996-b64d-d21be1fb411e-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.867303 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/314c5c33-739b-4996-b64d-d21be1fb411e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "314c5c33-739b-4996-b64d-d21be1fb411e" (UID: "314c5c33-739b-4996-b64d-d21be1fb411e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.868515 4835 scope.go:117] "RemoveContainer" containerID="ed4419e8677457ed3f953c4ea73fa6b89dfb89552b4ed2b3ccff373f5d763cb7" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.902662 4835 scope.go:117] "RemoveContainer" containerID="b23190634f4df80b6834fbedb03197d972484960e2284649d78207af4f29c3f8" Oct 03 19:53:47 crc kubenswrapper[4835]: E1003 19:53:47.903242 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b23190634f4df80b6834fbedb03197d972484960e2284649d78207af4f29c3f8\": container with ID starting with b23190634f4df80b6834fbedb03197d972484960e2284649d78207af4f29c3f8 not found: ID does not exist" containerID="b23190634f4df80b6834fbedb03197d972484960e2284649d78207af4f29c3f8" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.903287 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b23190634f4df80b6834fbedb03197d972484960e2284649d78207af4f29c3f8"} err="failed to get container status \"b23190634f4df80b6834fbedb03197d972484960e2284649d78207af4f29c3f8\": rpc error: code = NotFound desc = could not find container \"b23190634f4df80b6834fbedb03197d972484960e2284649d78207af4f29c3f8\": container with ID starting with b23190634f4df80b6834fbedb03197d972484960e2284649d78207af4f29c3f8 not found: ID does not exist" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.903317 4835 scope.go:117] "RemoveContainer" containerID="02f7760cb0145c8aef5913c0fed0f5692a20137426080f325f137c398b79d43a" Oct 03 19:53:47 crc kubenswrapper[4835]: E1003 19:53:47.903662 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02f7760cb0145c8aef5913c0fed0f5692a20137426080f325f137c398b79d43a\": container with ID starting with 02f7760cb0145c8aef5913c0fed0f5692a20137426080f325f137c398b79d43a not found: ID does not exist" containerID="02f7760cb0145c8aef5913c0fed0f5692a20137426080f325f137c398b79d43a" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.903718 4835 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02f7760cb0145c8aef5913c0fed0f5692a20137426080f325f137c398b79d43a"} err="failed to get container status \"02f7760cb0145c8aef5913c0fed0f5692a20137426080f325f137c398b79d43a\": rpc error: code = NotFound desc = could not find container \"02f7760cb0145c8aef5913c0fed0f5692a20137426080f325f137c398b79d43a\": container with ID starting with 02f7760cb0145c8aef5913c0fed0f5692a20137426080f325f137c398b79d43a not found: ID does not exist" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.903756 4835 scope.go:117] "RemoveContainer" containerID="ed4419e8677457ed3f953c4ea73fa6b89dfb89552b4ed2b3ccff373f5d763cb7" Oct 03 19:53:47 crc kubenswrapper[4835]: E1003 19:53:47.907991 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed4419e8677457ed3f953c4ea73fa6b89dfb89552b4ed2b3ccff373f5d763cb7\": container with ID starting with ed4419e8677457ed3f953c4ea73fa6b89dfb89552b4ed2b3ccff373f5d763cb7 not found: ID does not exist" containerID="ed4419e8677457ed3f953c4ea73fa6b89dfb89552b4ed2b3ccff373f5d763cb7" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.908034 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed4419e8677457ed3f953c4ea73fa6b89dfb89552b4ed2b3ccff373f5d763cb7"} err="failed to get container status \"ed4419e8677457ed3f953c4ea73fa6b89dfb89552b4ed2b3ccff373f5d763cb7\": rpc error: code = NotFound desc = could not find container \"ed4419e8677457ed3f953c4ea73fa6b89dfb89552b4ed2b3ccff373f5d763cb7\": container with ID starting with ed4419e8677457ed3f953c4ea73fa6b89dfb89552b4ed2b3ccff373f5d763cb7 not found: ID does not exist" Oct 03 19:53:47 crc kubenswrapper[4835]: I1003 19:53:47.951465 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/314c5c33-739b-4996-b64d-d21be1fb411e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 19:53:48 crc kubenswrapper[4835]: I1003 19:53:48.009099 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt_a719243c-35c2-4ecf-89ff-7843179f36de/pull/0.log" Oct 03 19:53:48 crc kubenswrapper[4835]: I1003 19:53:48.015724 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt_a719243c-35c2-4ecf-89ff-7843179f36de/pull/0.log" Oct 03 19:53:48 crc kubenswrapper[4835]: I1003 19:53:48.017864 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt_a719243c-35c2-4ecf-89ff-7843179f36de/util/0.log" Oct 03 19:53:48 crc kubenswrapper[4835]: I1003 19:53:48.079172 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f284b"] Oct 03 19:53:48 crc kubenswrapper[4835]: I1003 19:53:48.088936 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f284b"] Oct 03 19:53:48 crc kubenswrapper[4835]: I1003 19:53:48.220687 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt_a719243c-35c2-4ecf-89ff-7843179f36de/util/0.log" Oct 03 19:53:48 crc kubenswrapper[4835]: I1003 19:53:48.246649 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt_a719243c-35c2-4ecf-89ff-7843179f36de/pull/0.log" Oct 03 19:53:48 crc kubenswrapper[4835]: I1003 19:53:48.301606 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_76687f1865e943b5e733377159d20f47def6a1e8e38de9d5e3bf0b39a3m9jpt_a719243c-35c2-4ecf-89ff-7843179f36de/extract/0.log" Oct 03 19:53:48 crc kubenswrapper[4835]: I1003 19:53:48.428997 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-8b5bp_0d92cdf1-cce4-485e-9ed5-2539600d7e36/kube-rbac-proxy/0.log" Oct 03 19:53:48 crc kubenswrapper[4835]: I1003 19:53:48.531936 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-8b5bp_0d92cdf1-cce4-485e-9ed5-2539600d7e36/manager/0.log" Oct 03 19:53:48 crc kubenswrapper[4835]: I1003 19:53:48.540564 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-h6txd_52d602a7-7c52-410b-b0d3-7a2233258474/kube-rbac-proxy/0.log" Oct 03 19:53:48 crc kubenswrapper[4835]: I1003 19:53:48.757803 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-2w59k_f7af64e9-1970-4672-8564-ba96ab371353/kube-rbac-proxy/0.log" Oct 03 19:53:48 crc kubenswrapper[4835]: I1003 19:53:48.764527 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-h6txd_52d602a7-7c52-410b-b0d3-7a2233258474/manager/0.log" Oct 03 19:53:48 crc kubenswrapper[4835]: I1003 19:53:48.813849 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-2w59k_f7af64e9-1970-4672-8564-ba96ab371353/manager/0.log" Oct 03 19:53:48 crc kubenswrapper[4835]: I1003 19:53:48.890362 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="314c5c33-739b-4996-b64d-d21be1fb411e" path="/var/lib/kubelet/pods/314c5c33-739b-4996-b64d-d21be1fb411e/volumes" Oct 03 19:53:48 crc kubenswrapper[4835]: I1003 19:53:48.891272 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be514a5b-7674-404e-ad18-2cd515cc6c57" path="/var/lib/kubelet/pods/be514a5b-7674-404e-ad18-2cd515cc6c57/volumes" Oct 03 19:53:49 crc kubenswrapper[4835]: I1003 19:53:49.000530 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-5j994_6b594ae9-45e5-4ae6-b59e-8e1e44a182db/kube-rbac-proxy/0.log" Oct 03 19:53:49 crc kubenswrapper[4835]: I1003 19:53:49.082228 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-5j994_6b594ae9-45e5-4ae6-b59e-8e1e44a182db/manager/0.log" Oct 03 19:53:49 crc kubenswrapper[4835]: I1003 19:53:49.207220 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-lpmkz_9a423f22-efb7-4413-b0ba-886fb392aa5c/manager/0.log" Oct 03 19:53:49 crc kubenswrapper[4835]: I1003 19:53:49.222555 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-lpmkz_9a423f22-efb7-4413-b0ba-886fb392aa5c/kube-rbac-proxy/0.log" Oct 03 19:53:49 crc kubenswrapper[4835]: I1003 19:53:49.281295 4835 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-dnbkj_75049f02-3a34-4376-b2b2-1c447894b16c/kube-rbac-proxy/0.log" Oct 03 19:53:49 crc kubenswrapper[4835]: I1003 19:53:49.454414 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-dnbkj_75049f02-3a34-4376-b2b2-1c447894b16c/manager/0.log" Oct 03 19:53:49 crc kubenswrapper[4835]: I1003 19:53:49.536551 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-5hv6c_050d6159-92eb-4f65-8c33-dce9d2cac262/kube-rbac-proxy/0.log" Oct 03 19:53:49 crc kubenswrapper[4835]: I1003 19:53:49.745636 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-9x4rx_967b019a-36f2-4dda-8d6f-968cfb65f954/kube-rbac-proxy/0.log" Oct 03 19:53:49 crc kubenswrapper[4835]: I1003 19:53:49.770198 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-5hv6c_050d6159-92eb-4f65-8c33-dce9d2cac262/manager/0.log" Oct 03 19:53:49 crc kubenswrapper[4835]: I1003 19:53:49.837402 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-9x4rx_967b019a-36f2-4dda-8d6f-968cfb65f954/manager/0.log" Oct 03 19:53:49 crc kubenswrapper[4835]: I1003 19:53:49.993887 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-6dbbn_3d57a2da-d32b-4fea-8044-da2ab34d87d5/kube-rbac-proxy/0.log" Oct 03 19:53:50 crc kubenswrapper[4835]: I1003 19:53:50.063912 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-6dbbn_3d57a2da-d32b-4fea-8044-da2ab34d87d5/manager/0.log" Oct 03 19:53:50 crc kubenswrapper[4835]: I1003 19:53:50.227714 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-bxwnx_c248086e-e0fe-47f0-972e-8378189c018f/kube-rbac-proxy/0.log" Oct 03 19:53:50 crc kubenswrapper[4835]: I1003 19:53:50.262644 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-bxwnx_c248086e-e0fe-47f0-972e-8378189c018f/manager/0.log" Oct 03 19:53:50 crc kubenswrapper[4835]: I1003 19:53:50.293916 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-g5nh8_822f37e4-b1f2-40ed-a075-cdff171e42bd/kube-rbac-proxy/0.log" Oct 03 19:53:50 crc kubenswrapper[4835]: I1003 19:53:50.456597 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-g5nh8_822f37e4-b1f2-40ed-a075-cdff171e42bd/manager/0.log" Oct 03 19:53:50 crc kubenswrapper[4835]: I1003 19:53:50.546507 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-rrqt8_f2b0de06-963e-4fd4-8056-3272f20f3ef8/kube-rbac-proxy/0.log" Oct 03 19:53:50 crc kubenswrapper[4835]: I1003 19:53:50.636213 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-rrqt8_f2b0de06-963e-4fd4-8056-3272f20f3ef8/manager/0.log" Oct 03 19:53:50 crc kubenswrapper[4835]: I1003 
19:53:50.761939 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-xrb5b_85ad7a18-4a2c-4225-972b-5e12be17aee0/kube-rbac-proxy/0.log" Oct 03 19:53:50 crc kubenswrapper[4835]: I1003 19:53:50.849497 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-xrb5b_85ad7a18-4a2c-4225-972b-5e12be17aee0/manager/0.log" Oct 03 19:53:50 crc kubenswrapper[4835]: I1003 19:53:50.973107 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-dmgs4_54c68b19-b66f-47af-808e-e3708ee36642/kube-rbac-proxy/0.log" Oct 03 19:53:51 crc kubenswrapper[4835]: I1003 19:53:51.002022 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-dmgs4_54c68b19-b66f-47af-808e-e3708ee36642/manager/0.log" Oct 03 19:53:51 crc kubenswrapper[4835]: I1003 19:53:51.092052 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t_e96c6565-8432-4b47-bc1a-f7510415a0dd/kube-rbac-proxy/0.log" Oct 03 19:53:51 crc kubenswrapper[4835]: I1003 19:53:51.128878 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d6788wn8t_e96c6565-8432-4b47-bc1a-f7510415a0dd/manager/0.log" Oct 03 19:53:51 crc kubenswrapper[4835]: I1003 19:53:51.257297 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b9c56875d-f9xpd_c997309f-800d-4180-aaa9-2594faef74ee/kube-rbac-proxy/0.log" Oct 03 19:53:51 crc kubenswrapper[4835]: I1003 19:53:51.384771 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6479c8db94-bn59w_0369532e-6ba2-4da2-9e1a-c8870d14f001/kube-rbac-proxy/0.log" Oct 03 19:53:51 crc kubenswrapper[4835]: I1003 19:53:51.604370 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6479c8db94-bn59w_0369532e-6ba2-4da2-9e1a-c8870d14f001/operator/0.log" Oct 03 19:53:51 crc kubenswrapper[4835]: I1003 19:53:51.615799 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-ct8fh_ef6029b4-ceb8-498c-9925-74d367072557/registry-server/0.log" Oct 03 19:53:51 crc kubenswrapper[4835]: I1003 19:53:51.710963 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-6t4pn_3c7d1a89-7769-4031-abac-edd6b93bdd30/kube-rbac-proxy/0.log" Oct 03 19:53:51 crc kubenswrapper[4835]: I1003 19:53:51.914965 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-6t4pn_3c7d1a89-7769-4031-abac-edd6b93bdd30/manager/0.log" Oct 03 19:53:51 crc kubenswrapper[4835]: I1003 19:53:51.950355 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-xdgkx_11585794-4db5-4e34-aeaa-24036489269b/kube-rbac-proxy/0.log" Oct 03 19:53:52 crc kubenswrapper[4835]: I1003 19:53:52.007305 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-xdgkx_11585794-4db5-4e34-aeaa-24036489269b/manager/0.log" Oct 03 
19:53:52 crc kubenswrapper[4835]: I1003 19:53:52.200613 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-5pnzl_afea5a2a-fdd8-47a5-b09e-44f47b2f3f92/operator/0.log" Oct 03 19:53:52 crc kubenswrapper[4835]: I1003 19:53:52.324269 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-fljf9_cebb3329-2036-4540-9444-d47294d13aff/kube-rbac-proxy/0.log" Oct 03 19:53:52 crc kubenswrapper[4835]: I1003 19:53:52.467565 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5db5cf686f-bzmxq_ad1db9d5-da6c-4498-8bb4-176c87481a4d/kube-rbac-proxy/0.log" Oct 03 19:53:52 crc kubenswrapper[4835]: I1003 19:53:52.484222 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-fljf9_cebb3329-2036-4540-9444-d47294d13aff/manager/0.log" Oct 03 19:53:52 crc kubenswrapper[4835]: I1003 19:53:52.727593 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-6g488_461cd79d-16ba-4619-b2f0-1e4611a092e1/kube-rbac-proxy/0.log" Oct 03 19:53:52 crc kubenswrapper[4835]: I1003 19:53:52.895686 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-6g488_461cd79d-16ba-4619-b2f0-1e4611a092e1/manager/0.log" Oct 03 19:53:53 crc kubenswrapper[4835]: I1003 19:53:53.040522 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b9c56875d-f9xpd_c997309f-800d-4180-aaa9-2594faef74ee/manager/0.log" Oct 03 19:53:53 crc kubenswrapper[4835]: I1003 19:53:53.043721 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5db5cf686f-bzmxq_ad1db9d5-da6c-4498-8bb4-176c87481a4d/manager/0.log" Oct 03 19:53:53 crc kubenswrapper[4835]: I1003 19:53:53.044177 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7d5d9b469c-dh8bv_01389591-f485-4a18-a393-8f5d654ba5e7/kube-rbac-proxy/0.log" Oct 03 19:53:53 crc kubenswrapper[4835]: I1003 19:53:53.197541 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7d5d9b469c-dh8bv_01389591-f485-4a18-a393-8f5d654ba5e7/manager/0.log" Oct 03 19:53:55 crc kubenswrapper[4835]: I1003 19:53:55.877037 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:53:55 crc kubenswrapper[4835]: E1003 19:53:55.877919 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:54:09 crc kubenswrapper[4835]: I1003 19:54:09.877704 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:54:09 crc kubenswrapper[4835]: E1003 19:54:09.878610 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:54:11 crc kubenswrapper[4835]: I1003 19:54:11.077106 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-5rj92_aa5e9c31-c582-444e-97c2-9e285e2b75d4/control-plane-machine-set-operator/0.log" Oct 03 19:54:11 crc kubenswrapper[4835]: I1003 19:54:11.199761 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-v7t7j_76f0bfed-fea0-4d28-bb03-2d3b0ae79d92/kube-rbac-proxy/0.log" Oct 03 19:54:11 crc kubenswrapper[4835]: I1003 19:54:11.293475 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-v7t7j_76f0bfed-fea0-4d28-bb03-2d3b0ae79d92/machine-api-operator/0.log" Oct 03 19:54:23 crc kubenswrapper[4835]: I1003 19:54:23.877719 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:54:23 crc kubenswrapper[4835]: E1003 19:54:23.878768 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:54:25 crc kubenswrapper[4835]: I1003 19:54:25.072862 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-xlhlh_ca95ceab-81bd-4963-80c1-321c5b1c63ef/cert-manager-controller/0.log" Oct 03 19:54:25 crc kubenswrapper[4835]: I1003 19:54:25.267232 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-wsd6d_bb91054f-ac67-458b-9c77-5309597b870f/cert-manager-cainjector/0.log" Oct 03 19:54:25 crc kubenswrapper[4835]: I1003 19:54:25.348354 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-hxx5j_3f302462-ea09-488c-93f0-48e3a331b672/cert-manager-webhook/0.log" Oct 03 19:54:38 crc kubenswrapper[4835]: I1003 19:54:38.887759 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:54:38 crc kubenswrapper[4835]: E1003 19:54:38.888796 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:54:39 crc kubenswrapper[4835]: I1003 19:54:39.241639 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-vpn8g_394f1e44-03f0-460e-be0b-a526690916d4/nmstate-console-plugin/0.log" Oct 03 19:54:39 crc kubenswrapper[4835]: I1003 19:54:39.480195 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-q7ccg_efa8995d-566a-4681-8a8d-04c75bb2e5ff/nmstate-handler/0.log" Oct 03 19:54:39 crc kubenswrapper[4835]: I1003 19:54:39.509748 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-4br2m_4e2a2a3c-af08-4f6e-88d7-e42c6327d83a/nmstate-metrics/0.log" Oct 03 19:54:39 crc kubenswrapper[4835]: I1003 19:54:39.562838 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-4br2m_4e2a2a3c-af08-4f6e-88d7-e42c6327d83a/kube-rbac-proxy/0.log" Oct 03 19:54:39 crc kubenswrapper[4835]: I1003 19:54:39.695041 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-nphxp_6278a01c-35aa-4828-a8b8-2bc36e31b756/nmstate-operator/0.log" Oct 03 19:54:39 crc kubenswrapper[4835]: I1003 19:54:39.779427 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-t9j4g_51b41f85-9699-4216-857b-1d79a2cbc755/nmstate-webhook/0.log" Oct 03 19:54:50 crc kubenswrapper[4835]: I1003 19:54:50.880245 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:54:50 crc kubenswrapper[4835]: E1003 19:54:50.881687 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:54:55 crc kubenswrapper[4835]: I1003 19:54:55.962639 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-7lqwn_dc3150d4-d27e-43ed-9659-7034816a3221/kube-rbac-proxy/0.log" Oct 03 19:54:56 crc kubenswrapper[4835]: I1003 19:54:56.130450 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-7lqwn_dc3150d4-d27e-43ed-9659-7034816a3221/controller/0.log" Oct 03 19:54:56 crc kubenswrapper[4835]: I1003 19:54:56.155512 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/cp-frr-files/0.log" Oct 03 19:54:56 crc kubenswrapper[4835]: I1003 19:54:56.392337 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/cp-frr-files/0.log" Oct 03 19:54:56 crc kubenswrapper[4835]: I1003 19:54:56.419856 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/cp-reloader/0.log" Oct 03 19:54:56 crc kubenswrapper[4835]: I1003 19:54:56.429727 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/cp-reloader/0.log" Oct 03 19:54:56 crc kubenswrapper[4835]: I1003 19:54:56.429919 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/cp-metrics/0.log" Oct 03 19:54:56 crc kubenswrapper[4835]: I1003 19:54:56.686874 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/cp-metrics/0.log" Oct 03 19:54:56 crc kubenswrapper[4835]: I1003 19:54:56.698824 4835 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/cp-metrics/0.log" Oct 03 19:54:56 crc kubenswrapper[4835]: I1003 19:54:56.706612 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/cp-frr-files/0.log" Oct 03 19:54:56 crc kubenswrapper[4835]: I1003 19:54:56.706772 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/cp-reloader/0.log" Oct 03 19:54:56 crc kubenswrapper[4835]: I1003 19:54:56.936222 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/cp-reloader/0.log" Oct 03 19:54:56 crc kubenswrapper[4835]: I1003 19:54:56.976842 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/cp-frr-files/0.log" Oct 03 19:54:56 crc kubenswrapper[4835]: I1003 19:54:56.988575 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/cp-metrics/0.log" Oct 03 19:54:57 crc kubenswrapper[4835]: I1003 19:54:57.016348 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/controller/0.log" Oct 03 19:54:57 crc kubenswrapper[4835]: I1003 19:54:57.197423 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/frr-metrics/0.log" Oct 03 19:54:57 crc kubenswrapper[4835]: I1003 19:54:57.212327 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/kube-rbac-proxy/0.log" Oct 03 19:54:57 crc kubenswrapper[4835]: I1003 19:54:57.236388 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/kube-rbac-proxy-frr/0.log" Oct 03 19:54:57 crc kubenswrapper[4835]: I1003 19:54:57.421897 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/reloader/0.log" Oct 03 19:54:57 crc kubenswrapper[4835]: I1003 19:54:57.540706 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-pfgk7_75a37aa1-a5db-4a60-b8c3-c06677284925/frr-k8s-webhook-server/0.log" Oct 03 19:54:57 crc kubenswrapper[4835]: I1003 19:54:57.739780 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-549cbc687f-7s58s_87d605ee-e88a-4b71-9033-029e4ceaf6e5/manager/0.log" Oct 03 19:54:57 crc kubenswrapper[4835]: I1003 19:54:57.913125 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-59cd86bdc9-s28jx_a33193c6-6de6-466f-bc95-046e6d7ae204/webhook-server/0.log" Oct 03 19:54:58 crc kubenswrapper[4835]: I1003 19:54:58.039115 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tcsl9_d15b62f6-9ba0-4ed6-a25d-a7629c881a6d/kube-rbac-proxy/0.log" Oct 03 19:54:58 crc kubenswrapper[4835]: I1003 19:54:58.710668 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tcsl9_d15b62f6-9ba0-4ed6-a25d-a7629c881a6d/speaker/0.log" Oct 03 19:54:59 crc kubenswrapper[4835]: I1003 19:54:59.046007 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-h2mqk_0b7c13ae-5fc7-4488-ad26-05f62f390a60/frr/0.log" Oct 03 19:55:02 crc kubenswrapper[4835]: I1003 19:55:02.877374 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:55:02 crc kubenswrapper[4835]: E1003 19:55:02.878577 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:55:13 crc kubenswrapper[4835]: I1003 19:55:13.929828 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p_62075311-2baa-4bed-aaf7-df0d5ac3e3f3/util/0.log" Oct 03 19:55:14 crc kubenswrapper[4835]: I1003 19:55:14.151845 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p_62075311-2baa-4bed-aaf7-df0d5ac3e3f3/pull/0.log" Oct 03 19:55:14 crc kubenswrapper[4835]: I1003 19:55:14.180503 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p_62075311-2baa-4bed-aaf7-df0d5ac3e3f3/util/0.log" Oct 03 19:55:14 crc kubenswrapper[4835]: I1003 19:55:14.194800 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p_62075311-2baa-4bed-aaf7-df0d5ac3e3f3/pull/0.log" Oct 03 19:55:14 crc kubenswrapper[4835]: I1003 19:55:14.411938 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p_62075311-2baa-4bed-aaf7-df0d5ac3e3f3/util/0.log" Oct 03 19:55:14 crc kubenswrapper[4835]: I1003 19:55:14.469271 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p_62075311-2baa-4bed-aaf7-df0d5ac3e3f3/extract/0.log" Oct 03 19:55:14 crc kubenswrapper[4835]: I1003 19:55:14.471561 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2rw74p_62075311-2baa-4bed-aaf7-df0d5ac3e3f3/pull/0.log" Oct 03 19:55:14 crc kubenswrapper[4835]: I1003 19:55:14.627915 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz_368259bf-9e2d-45b2-9c90-98b0f1081180/util/0.log" Oct 03 19:55:14 crc kubenswrapper[4835]: I1003 19:55:14.873992 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz_368259bf-9e2d-45b2-9c90-98b0f1081180/pull/0.log" Oct 03 19:55:14 crc kubenswrapper[4835]: I1003 19:55:14.877394 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz_368259bf-9e2d-45b2-9c90-98b0f1081180/util/0.log" Oct 03 19:55:14 crc kubenswrapper[4835]: I1003 19:55:14.942661 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz_368259bf-9e2d-45b2-9c90-98b0f1081180/pull/0.log" Oct 03 19:55:15 crc kubenswrapper[4835]: I1003 19:55:15.088836 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz_368259bf-9e2d-45b2-9c90-98b0f1081180/pull/0.log" Oct 03 19:55:15 crc kubenswrapper[4835]: I1003 19:55:15.109108 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz_368259bf-9e2d-45b2-9c90-98b0f1081180/extract/0.log" Oct 03 19:55:15 crc kubenswrapper[4835]: I1003 19:55:15.142244 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2df59tz_368259bf-9e2d-45b2-9c90-98b0f1081180/util/0.log" Oct 03 19:55:15 crc kubenswrapper[4835]: I1003 19:55:15.317218 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lfd5t_c615419b-72c3-48d9-91b0-918dc3215104/extract-utilities/0.log" Oct 03 19:55:15 crc kubenswrapper[4835]: I1003 19:55:15.563199 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lfd5t_c615419b-72c3-48d9-91b0-918dc3215104/extract-utilities/0.log" Oct 03 19:55:15 crc kubenswrapper[4835]: I1003 19:55:15.584922 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lfd5t_c615419b-72c3-48d9-91b0-918dc3215104/extract-content/0.log" Oct 03 19:55:15 crc kubenswrapper[4835]: I1003 19:55:15.603190 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lfd5t_c615419b-72c3-48d9-91b0-918dc3215104/extract-content/0.log" Oct 03 19:55:15 crc kubenswrapper[4835]: I1003 19:55:15.767199 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lfd5t_c615419b-72c3-48d9-91b0-918dc3215104/extract-utilities/0.log" Oct 03 19:55:16 crc kubenswrapper[4835]: I1003 19:55:16.078040 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lfd5t_c615419b-72c3-48d9-91b0-918dc3215104/extract-content/0.log" Oct 03 19:55:16 crc kubenswrapper[4835]: I1003 19:55:16.308655 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8p8gm_2af817b8-e0e4-43b7-9000-566f1ef27a80/extract-utilities/0.log" Oct 03 19:55:16 crc kubenswrapper[4835]: I1003 19:55:16.621696 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8p8gm_2af817b8-e0e4-43b7-9000-566f1ef27a80/extract-content/0.log" Oct 03 19:55:16 crc kubenswrapper[4835]: I1003 19:55:16.648549 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8p8gm_2af817b8-e0e4-43b7-9000-566f1ef27a80/extract-content/0.log" Oct 03 19:55:16 crc kubenswrapper[4835]: I1003 19:55:16.652379 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8p8gm_2af817b8-e0e4-43b7-9000-566f1ef27a80/extract-utilities/0.log" Oct 03 19:55:16 crc kubenswrapper[4835]: I1003 19:55:16.845463 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lfd5t_c615419b-72c3-48d9-91b0-918dc3215104/registry-server/0.log" Oct 03 19:55:16 
crc kubenswrapper[4835]: I1003 19:55:16.881535 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:55:16 crc kubenswrapper[4835]: E1003 19:55:16.882043 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:55:16 crc kubenswrapper[4835]: I1003 19:55:16.925881 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8p8gm_2af817b8-e0e4-43b7-9000-566f1ef27a80/extract-content/0.log" Oct 03 19:55:16 crc kubenswrapper[4835]: I1003 19:55:16.973630 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8p8gm_2af817b8-e0e4-43b7-9000-566f1ef27a80/extract-utilities/0.log" Oct 03 19:55:17 crc kubenswrapper[4835]: I1003 19:55:17.243247 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx_dd852437-5bbf-421e-ba98-2a923677a63b/util/0.log" Oct 03 19:55:17 crc kubenswrapper[4835]: I1003 19:55:17.431471 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx_dd852437-5bbf-421e-ba98-2a923677a63b/util/0.log" Oct 03 19:55:17 crc kubenswrapper[4835]: I1003 19:55:17.482234 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx_dd852437-5bbf-421e-ba98-2a923677a63b/pull/0.log" Oct 03 19:55:17 crc kubenswrapper[4835]: I1003 19:55:17.554801 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx_dd852437-5bbf-421e-ba98-2a923677a63b/pull/0.log" Oct 03 19:55:17 crc kubenswrapper[4835]: I1003 19:55:17.822775 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx_dd852437-5bbf-421e-ba98-2a923677a63b/pull/0.log" Oct 03 19:55:17 crc kubenswrapper[4835]: I1003 19:55:17.897396 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx_dd852437-5bbf-421e-ba98-2a923677a63b/extract/0.log" Oct 03 19:55:17 crc kubenswrapper[4835]: I1003 19:55:17.909364 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c2whzx_dd852437-5bbf-421e-ba98-2a923677a63b/util/0.log" Oct 03 19:55:17 crc kubenswrapper[4835]: I1003 19:55:17.934716 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8p8gm_2af817b8-e0e4-43b7-9000-566f1ef27a80/registry-server/0.log" Oct 03 19:55:18 crc kubenswrapper[4835]: I1003 19:55:18.051837 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2lz79_340f6531-0442-4899-87ea-795466615b9b/marketplace-operator/0.log" Oct 03 19:55:18 crc kubenswrapper[4835]: I1003 19:55:18.103556 4835 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sbbwc_5d055144-3a57-4244-ba27-468b77001e54/extract-utilities/0.log" Oct 03 19:55:18 crc kubenswrapper[4835]: I1003 19:55:18.350247 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sbbwc_5d055144-3a57-4244-ba27-468b77001e54/extract-content/0.log" Oct 03 19:55:18 crc kubenswrapper[4835]: I1003 19:55:18.354672 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sbbwc_5d055144-3a57-4244-ba27-468b77001e54/extract-utilities/0.log" Oct 03 19:55:18 crc kubenswrapper[4835]: I1003 19:55:18.402675 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sbbwc_5d055144-3a57-4244-ba27-468b77001e54/extract-content/0.log" Oct 03 19:55:18 crc kubenswrapper[4835]: I1003 19:55:18.588728 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sbbwc_5d055144-3a57-4244-ba27-468b77001e54/extract-content/0.log" Oct 03 19:55:18 crc kubenswrapper[4835]: I1003 19:55:18.636934 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sbbwc_5d055144-3a57-4244-ba27-468b77001e54/extract-utilities/0.log" Oct 03 19:55:18 crc kubenswrapper[4835]: I1003 19:55:18.674468 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z6rtw_20470f97-2625-4ea5-87b4-7eadb2bdf759/extract-utilities/0.log" Oct 03 19:55:18 crc kubenswrapper[4835]: I1003 19:55:18.889392 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z6rtw_20470f97-2625-4ea5-87b4-7eadb2bdf759/extract-utilities/0.log" Oct 03 19:55:18 crc kubenswrapper[4835]: I1003 19:55:18.890361 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-sbbwc_5d055144-3a57-4244-ba27-468b77001e54/registry-server/0.log" Oct 03 19:55:18 crc kubenswrapper[4835]: I1003 19:55:18.960591 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z6rtw_20470f97-2625-4ea5-87b4-7eadb2bdf759/extract-content/0.log" Oct 03 19:55:18 crc kubenswrapper[4835]: I1003 19:55:18.984371 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z6rtw_20470f97-2625-4ea5-87b4-7eadb2bdf759/extract-content/0.log" Oct 03 19:55:19 crc kubenswrapper[4835]: I1003 19:55:19.184515 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z6rtw_20470f97-2625-4ea5-87b4-7eadb2bdf759/extract-content/0.log" Oct 03 19:55:19 crc kubenswrapper[4835]: I1003 19:55:19.193381 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z6rtw_20470f97-2625-4ea5-87b4-7eadb2bdf759/extract-utilities/0.log" Oct 03 19:55:19 crc kubenswrapper[4835]: I1003 19:55:19.969905 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z6rtw_20470f97-2625-4ea5-87b4-7eadb2bdf759/registry-server/0.log" Oct 03 19:55:29 crc kubenswrapper[4835]: I1003 19:55:29.877367 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:55:29 crc kubenswrapper[4835]: E1003 19:55:29.878577 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w4fql_openshift-machine-config-operator(10a8b8e7-c0f5-4c40-b0bd-b52379adae1f)\"" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" Oct 03 19:55:35 crc kubenswrapper[4835]: I1003 19:55:35.113026 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-9x6c4_d0c975ce-2198-4163-b431-7bad685dab35/prometheus-operator/0.log" Oct 03 19:55:35 crc kubenswrapper[4835]: I1003 19:55:35.297221 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5474f66f9-2225x_f7d6c825-42e2-4396-b3c4-b93c3c2f9442/prometheus-operator-admission-webhook/0.log" Oct 03 19:55:35 crc kubenswrapper[4835]: I1003 19:55:35.340537 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5474f66f9-gbz8t_ec28bb25-3e95-4d64-b5b2-7fecfa63db71/prometheus-operator-admission-webhook/0.log" Oct 03 19:55:35 crc kubenswrapper[4835]: I1003 19:55:35.533270 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-lhh2c_4334b8f1-99d4-4676-a64c-68704cfe50a8/operator/0.log" Oct 03 19:55:35 crc kubenswrapper[4835]: I1003 19:55:35.556541 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-cq5g4_477eb3b8-9f75-4dd3-bc90-ec855d242dc8/perses-operator/0.log" Oct 03 19:55:44 crc kubenswrapper[4835]: I1003 19:55:44.877501 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:55:45 crc kubenswrapper[4835]: I1003 19:55:45.252416 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"d15232ed71a2b09b63b522935db02aca917f25067fdb05fffa635d4625c1214d"} Oct 03 19:58:05 crc kubenswrapper[4835]: I1003 19:58:05.359430 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:58:05 crc kubenswrapper[4835]: I1003 19:58:05.360303 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:58:16 crc kubenswrapper[4835]: I1003 19:58:16.399510 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce1a2011-32e6-44ba-840e-d840da2bf0f3" containerID="fa6bba39ac7acb899467058c061ef5eab3302964ea92c0890bd50de167c91594" exitCode=0 Oct 03 19:58:16 crc kubenswrapper[4835]: I1003 19:58:16.399738 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-j7c72/must-gather-68jml" event={"ID":"ce1a2011-32e6-44ba-840e-d840da2bf0f3","Type":"ContainerDied","Data":"fa6bba39ac7acb899467058c061ef5eab3302964ea92c0890bd50de167c91594"} Oct 03 19:58:16 crc kubenswrapper[4835]: I1003 19:58:16.400889 4835 scope.go:117] "RemoveContainer" 
containerID="fa6bba39ac7acb899467058c061ef5eab3302964ea92c0890bd50de167c91594" Oct 03 19:58:17 crc kubenswrapper[4835]: I1003 19:58:17.369808 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-j7c72_must-gather-68jml_ce1a2011-32e6-44ba-840e-d840da2bf0f3/gather/0.log" Oct 03 19:58:25 crc kubenswrapper[4835]: I1003 19:58:25.908630 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-j7c72/must-gather-68jml"] Oct 03 19:58:25 crc kubenswrapper[4835]: I1003 19:58:25.909875 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-j7c72/must-gather-68jml" podUID="ce1a2011-32e6-44ba-840e-d840da2bf0f3" containerName="copy" containerID="cri-o://d1e8a51b6bc0846118d361d712b68ddbd31dc5acdcd360cb57b918fbddaa5c03" gracePeriod=2 Oct 03 19:58:25 crc kubenswrapper[4835]: I1003 19:58:25.923477 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-j7c72/must-gather-68jml"] Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.463773 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-j7c72_must-gather-68jml_ce1a2011-32e6-44ba-840e-d840da2bf0f3/copy/0.log" Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.465938 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j7c72/must-gather-68jml" Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.546500 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-j7c72_must-gather-68jml_ce1a2011-32e6-44ba-840e-d840da2bf0f3/copy/0.log" Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.547049 4835 generic.go:334] "Generic (PLEG): container finished" podID="ce1a2011-32e6-44ba-840e-d840da2bf0f3" containerID="d1e8a51b6bc0846118d361d712b68ddbd31dc5acdcd360cb57b918fbddaa5c03" exitCode=143 Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.547136 4835 scope.go:117] "RemoveContainer" containerID="d1e8a51b6bc0846118d361d712b68ddbd31dc5acdcd360cb57b918fbddaa5c03" Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.547160 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-j7c72/must-gather-68jml" Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.575530 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ce1a2011-32e6-44ba-840e-d840da2bf0f3-must-gather-output\") pod \"ce1a2011-32e6-44ba-840e-d840da2bf0f3\" (UID: \"ce1a2011-32e6-44ba-840e-d840da2bf0f3\") " Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.576471 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqpf9\" (UniqueName: \"kubernetes.io/projected/ce1a2011-32e6-44ba-840e-d840da2bf0f3-kube-api-access-zqpf9\") pod \"ce1a2011-32e6-44ba-840e-d840da2bf0f3\" (UID: \"ce1a2011-32e6-44ba-840e-d840da2bf0f3\") " Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.576611 4835 scope.go:117] "RemoveContainer" containerID="fa6bba39ac7acb899467058c061ef5eab3302964ea92c0890bd50de167c91594" Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.620409 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce1a2011-32e6-44ba-840e-d840da2bf0f3-kube-api-access-zqpf9" (OuterVolumeSpecName: "kube-api-access-zqpf9") pod "ce1a2011-32e6-44ba-840e-d840da2bf0f3" (UID: "ce1a2011-32e6-44ba-840e-d840da2bf0f3"). 
InnerVolumeSpecName "kube-api-access-zqpf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.674516 4835 scope.go:117] "RemoveContainer" containerID="d1e8a51b6bc0846118d361d712b68ddbd31dc5acdcd360cb57b918fbddaa5c03" Oct 03 19:58:26 crc kubenswrapper[4835]: E1003 19:58:26.678625 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1e8a51b6bc0846118d361d712b68ddbd31dc5acdcd360cb57b918fbddaa5c03\": container with ID starting with d1e8a51b6bc0846118d361d712b68ddbd31dc5acdcd360cb57b918fbddaa5c03 not found: ID does not exist" containerID="d1e8a51b6bc0846118d361d712b68ddbd31dc5acdcd360cb57b918fbddaa5c03" Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.678663 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1e8a51b6bc0846118d361d712b68ddbd31dc5acdcd360cb57b918fbddaa5c03"} err="failed to get container status \"d1e8a51b6bc0846118d361d712b68ddbd31dc5acdcd360cb57b918fbddaa5c03\": rpc error: code = NotFound desc = could not find container \"d1e8a51b6bc0846118d361d712b68ddbd31dc5acdcd360cb57b918fbddaa5c03\": container with ID starting with d1e8a51b6bc0846118d361d712b68ddbd31dc5acdcd360cb57b918fbddaa5c03 not found: ID does not exist" Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.678688 4835 scope.go:117] "RemoveContainer" containerID="fa6bba39ac7acb899467058c061ef5eab3302964ea92c0890bd50de167c91594" Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.679586 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqpf9\" (UniqueName: \"kubernetes.io/projected/ce1a2011-32e6-44ba-840e-d840da2bf0f3-kube-api-access-zqpf9\") on node \"crc\" DevicePath \"\"" Oct 03 19:58:26 crc kubenswrapper[4835]: E1003 19:58:26.680312 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa6bba39ac7acb899467058c061ef5eab3302964ea92c0890bd50de167c91594\": container with ID starting with fa6bba39ac7acb899467058c061ef5eab3302964ea92c0890bd50de167c91594 not found: ID does not exist" containerID="fa6bba39ac7acb899467058c061ef5eab3302964ea92c0890bd50de167c91594" Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.680338 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa6bba39ac7acb899467058c061ef5eab3302964ea92c0890bd50de167c91594"} err="failed to get container status \"fa6bba39ac7acb899467058c061ef5eab3302964ea92c0890bd50de167c91594\": rpc error: code = NotFound desc = could not find container \"fa6bba39ac7acb899467058c061ef5eab3302964ea92c0890bd50de167c91594\": container with ID starting with fa6bba39ac7acb899467058c061ef5eab3302964ea92c0890bd50de167c91594 not found: ID does not exist" Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.837983 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce1a2011-32e6-44ba-840e-d840da2bf0f3-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ce1a2011-32e6-44ba-840e-d840da2bf0f3" (UID: "ce1a2011-32e6-44ba-840e-d840da2bf0f3"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.885823 4835 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ce1a2011-32e6-44ba-840e-d840da2bf0f3-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 03 19:58:26 crc kubenswrapper[4835]: I1003 19:58:26.895124 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce1a2011-32e6-44ba-840e-d840da2bf0f3" path="/var/lib/kubelet/pods/ce1a2011-32e6-44ba-840e-d840da2bf0f3/volumes" Oct 03 19:58:35 crc kubenswrapper[4835]: I1003 19:58:35.359017 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:58:35 crc kubenswrapper[4835]: I1003 19:58:35.360273 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:59:05 crc kubenswrapper[4835]: I1003 19:59:05.358737 4835 patch_prober.go:28] interesting pod/machine-config-daemon-w4fql container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 19:59:05 crc kubenswrapper[4835]: I1003 19:59:05.361411 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 19:59:05 crc kubenswrapper[4835]: I1003 19:59:05.361506 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" Oct 03 19:59:05 crc kubenswrapper[4835]: I1003 19:59:05.362647 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d15232ed71a2b09b63b522935db02aca917f25067fdb05fffa635d4625c1214d"} pod="openshift-machine-config-operator/machine-config-daemon-w4fql" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 19:59:05 crc kubenswrapper[4835]: I1003 19:59:05.362748 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" podUID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerName="machine-config-daemon" containerID="cri-o://d15232ed71a2b09b63b522935db02aca917f25067fdb05fffa635d4625c1214d" gracePeriod=600 Oct 03 19:59:06 crc kubenswrapper[4835]: I1003 19:59:06.095867 4835 generic.go:334] "Generic (PLEG): container finished" podID="10a8b8e7-c0f5-4c40-b0bd-b52379adae1f" containerID="d15232ed71a2b09b63b522935db02aca917f25067fdb05fffa635d4625c1214d" exitCode=0 Oct 03 19:59:06 crc kubenswrapper[4835]: I1003 19:59:06.096104 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" 
event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerDied","Data":"d15232ed71a2b09b63b522935db02aca917f25067fdb05fffa635d4625c1214d"} Oct 03 19:59:06 crc kubenswrapper[4835]: I1003 19:59:06.096711 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w4fql" event={"ID":"10a8b8e7-c0f5-4c40-b0bd-b52379adae1f","Type":"ContainerStarted","Data":"ed6c36f024443fe607eb841a62b8db54fc9d74262471ca1b85e2e633bdf2b553"} Oct 03 19:59:06 crc kubenswrapper[4835]: I1003 19:59:06.096743 4835 scope.go:117] "RemoveContainer" containerID="0ae87b079abe58979f3e5ed425a4bccb3241e251b63aa7c575550ae11a209385" Oct 03 19:59:46 crc kubenswrapper[4835]: I1003 19:59:46.188994 4835 scope.go:117] "RemoveContainer" containerID="1896120bec1f9e369ea27e31e08b60a5f9c11448e0e5d6720059f4a644b346cf" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.179834 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q"] Oct 03 20:00:00 crc kubenswrapper[4835]: E1003 20:00:00.181373 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="314c5c33-739b-4996-b64d-d21be1fb411e" containerName="registry-server" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.181401 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="314c5c33-739b-4996-b64d-d21be1fb411e" containerName="registry-server" Oct 03 20:00:00 crc kubenswrapper[4835]: E1003 20:00:00.181430 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="314c5c33-739b-4996-b64d-d21be1fb411e" containerName="extract-content" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.181441 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="314c5c33-739b-4996-b64d-d21be1fb411e" containerName="extract-content" Oct 03 20:00:00 crc kubenswrapper[4835]: E1003 20:00:00.181497 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce1a2011-32e6-44ba-840e-d840da2bf0f3" containerName="gather" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.181510 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce1a2011-32e6-44ba-840e-d840da2bf0f3" containerName="gather" Oct 03 20:00:00 crc kubenswrapper[4835]: E1003 20:00:00.181531 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce1a2011-32e6-44ba-840e-d840da2bf0f3" containerName="copy" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.181542 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce1a2011-32e6-44ba-840e-d840da2bf0f3" containerName="copy" Oct 03 20:00:00 crc kubenswrapper[4835]: E1003 20:00:00.181571 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be514a5b-7674-404e-ad18-2cd515cc6c57" containerName="container-00" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.181583 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="be514a5b-7674-404e-ad18-2cd515cc6c57" containerName="container-00" Oct 03 20:00:00 crc kubenswrapper[4835]: E1003 20:00:00.181609 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="314c5c33-739b-4996-b64d-d21be1fb411e" containerName="extract-utilities" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.181620 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="314c5c33-739b-4996-b64d-d21be1fb411e" containerName="extract-utilities" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.181953 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce1a2011-32e6-44ba-840e-d840da2bf0f3" 
containerName="gather" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.181974 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="be514a5b-7674-404e-ad18-2cd515cc6c57" containerName="container-00" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.182008 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce1a2011-32e6-44ba-840e-d840da2bf0f3" containerName="copy" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.182052 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="314c5c33-739b-4996-b64d-d21be1fb411e" containerName="registry-server" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.183420 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.186585 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.187116 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.200338 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q"] Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.244353 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktmph\" (UniqueName: \"kubernetes.io/projected/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-kube-api-access-ktmph\") pod \"collect-profiles-29325360-4529q\" (UID: \"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.244550 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-secret-volume\") pod \"collect-profiles-29325360-4529q\" (UID: \"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.244585 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-config-volume\") pod \"collect-profiles-29325360-4529q\" (UID: \"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.346956 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-secret-volume\") pod \"collect-profiles-29325360-4529q\" (UID: \"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.347040 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-config-volume\") pod \"collect-profiles-29325360-4529q\" (UID: \"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.347176 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktmph\" (UniqueName: \"kubernetes.io/projected/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-kube-api-access-ktmph\") pod \"collect-profiles-29325360-4529q\" (UID: \"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.348740 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-config-volume\") pod \"collect-profiles-29325360-4529q\" (UID: \"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.356611 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-secret-volume\") pod \"collect-profiles-29325360-4529q\" (UID: \"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.378372 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktmph\" (UniqueName: \"kubernetes.io/projected/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-kube-api-access-ktmph\") pod \"collect-profiles-29325360-4529q\" (UID: \"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.507375 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q" Oct 03 20:00:00 crc kubenswrapper[4835]: I1003 20:00:00.825602 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q"] Oct 03 20:00:00 crc kubenswrapper[4835]: W1003 20:00:00.832988 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9b5e2ad_c8b9_4b7e_ae38_79edd077f1ab.slice/crio-190e7af48808e3cdc9437d47a9c35b6ab440c3ca7ee760e03f8e2e315c02934c WatchSource:0}: Error finding container 190e7af48808e3cdc9437d47a9c35b6ab440c3ca7ee760e03f8e2e315c02934c: Status 404 returned error can't find the container with id 190e7af48808e3cdc9437d47a9c35b6ab440c3ca7ee760e03f8e2e315c02934c Oct 03 20:00:01 crc kubenswrapper[4835]: I1003 20:00:01.842644 4835 generic.go:334] "Generic (PLEG): container finished" podID="b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab" containerID="3f0e260ba915ef862660d27f1993484beac4db5ece6c3fee14bf916e468dea48" exitCode=0 Oct 03 20:00:01 crc kubenswrapper[4835]: I1003 20:00:01.842762 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q" event={"ID":"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab","Type":"ContainerDied","Data":"3f0e260ba915ef862660d27f1993484beac4db5ece6c3fee14bf916e468dea48"} Oct 03 20:00:01 crc kubenswrapper[4835]: I1003 20:00:01.844431 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q" event={"ID":"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab","Type":"ContainerStarted","Data":"190e7af48808e3cdc9437d47a9c35b6ab440c3ca7ee760e03f8e2e315c02934c"} Oct 03 20:00:03 crc kubenswrapper[4835]: I1003 20:00:03.260298 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q" Oct 03 20:00:03 crc kubenswrapper[4835]: I1003 20:00:03.321170 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-secret-volume\") pod \"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab\" (UID: \"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab\") " Oct 03 20:00:03 crc kubenswrapper[4835]: I1003 20:00:03.321386 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-config-volume\") pod \"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab\" (UID: \"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab\") " Oct 03 20:00:03 crc kubenswrapper[4835]: I1003 20:00:03.321451 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktmph\" (UniqueName: \"kubernetes.io/projected/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-kube-api-access-ktmph\") pod \"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab\" (UID: \"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab\") " Oct 03 20:00:03 crc kubenswrapper[4835]: I1003 20:00:03.322875 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-config-volume" (OuterVolumeSpecName: "config-volume") pod "b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab" (UID: "b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 20:00:03 crc kubenswrapper[4835]: I1003 20:00:03.330948 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-kube-api-access-ktmph" (OuterVolumeSpecName: "kube-api-access-ktmph") pod "b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab" (UID: "b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab"). InnerVolumeSpecName "kube-api-access-ktmph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 20:00:03 crc kubenswrapper[4835]: I1003 20:00:03.333409 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab" (UID: "b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 20:00:03 crc kubenswrapper[4835]: I1003 20:00:03.425092 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 20:00:03 crc kubenswrapper[4835]: I1003 20:00:03.425145 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 20:00:03 crc kubenswrapper[4835]: I1003 20:00:03.425161 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktmph\" (UniqueName: \"kubernetes.io/projected/b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab-kube-api-access-ktmph\") on node \"crc\" DevicePath \"\"" Oct 03 20:00:03 crc kubenswrapper[4835]: I1003 20:00:03.872388 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q" Oct 03 20:00:03 crc kubenswrapper[4835]: I1003 20:00:03.872237 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29325360-4529q" event={"ID":"b9b5e2ad-c8b9-4b7e-ae38-79edd077f1ab","Type":"ContainerDied","Data":"190e7af48808e3cdc9437d47a9c35b6ab440c3ca7ee760e03f8e2e315c02934c"} Oct 03 20:00:03 crc kubenswrapper[4835]: I1003 20:00:03.872568 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="190e7af48808e3cdc9437d47a9c35b6ab440c3ca7ee760e03f8e2e315c02934c" Oct 03 20:00:04 crc kubenswrapper[4835]: I1003 20:00:04.366442 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v"] Oct 03 20:00:04 crc kubenswrapper[4835]: I1003 20:00:04.376562 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29325315-b2k8v"] Oct 03 20:00:04 crc kubenswrapper[4835]: I1003 20:00:04.890658 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04d298f3-24fe-4e66-8a43-e72c3de44ddf" path="/var/lib/kubelet/pods/04d298f3-24fe-4e66-8a43-e72c3de44ddf/volumes"